Friday, November 30, 2007

Infoworld reports Cost of data breaches keeps rising

Infoworld has an article supporting my contention that as more and more data is put online, the greater the impact of data security breaches will be. Check out the article.

Infoworld - Cost of data breaches keeps rising, by Matt Hines

Organizations that experience data breaches are paying more than ever to recover from the incidents and retain customers once the events become public knowledge, according to a new research report.

In its third annual study into the financial impact of data breaches, Ponemon Institute reports the episodes are costing an average of $197 per lost or stolen customer record during 2007, a slight bump from the figure of $182 per exposed record that it tracked one year ago, and a significant gain over the estimate of $128 per record that the research firm published in its initial 2005 survey.

Based on interviews conducted with 35 organizations that experienced data incidents in the last year, the Ponemon study found the average total cost of the breaches rose to $6.3 million in 2007, compared to an average of $4.8 million in 2006.

The average number of records exposed in the breaches Ponemon studied was roughly 20,000 per incident, although among those organizations surveyed the incidents ranged from as few as 4,000 records to more than 125,000 records.

One of the issues driving the continued escalation of data breach remediation expenses, according to the research, is the estimated cost of lost business and so-called customer churn that results from notification of the episodes.

Ponemon said the impact of customer churn and the amount of money necessary to acquire new business grew to an average of $128 per misplaced or stolen record in 2007, roughly a 30 percent increase over the figure of $98 per record published in last year's study.

Among the other expenses facing organizations that suffer data incidents are monies spent to provide customer support and credit monitoring services to affected individuals, along with budgets allocated for advertising and marketing efforts aimed at repairing companies' public images.

However, while the cost of breaches has continued to rise, the research company contends that many organizations have improved their ability to react to the events.

Even those firms who have experienced repeated incidents seem to have woken up to the idea of planning ahead and trying to prepare for the situation, said Dr. Larry Ponemon, the research firm's founder and chairman.

"We're finding that organizations seem to be better prepared with response plans, especially those that have experienced more than one breach, which was more than half of the people we interviewed," Ponemon said. "Companies have found that some of the tactics they used several years ago to respond didn't work out like they had imagined, and they're trying to find the most effective ways to retain customers while driving down costs."

For instance, said Ponemon, many companies affected by breaches in prior years attempted to curry favor with their customers by handing out gift coupons for their products or services.


Telecommuting: Are You One Of The Lucky 23 Out Of Every 100?

I was lucky enough to be amongst the first of many AT&T employees that have since been allowed to telecommute regularly. Way back in 1990, I gave up my cubicle for a home office, and only made the trip into the office once or twice a week for staff meetings.

I quickly found I loved the freedom it brought me and my supervisors noted that I was much more productive without the constant interruptions that office work brings.

But since I've ventured out into the consulting world, while I still get to do a portion of my work from home, most of it needs to be done at the client site.

I miss my mornings at home, and I'm not alone. Read PC World - Telecommuting: A Quarter of U.S. Workers Do It Regularly:

A survey released Tuesday by Citrix Online found that 23 percent of American workers regularly do their jobs from someplace besides the office, and that 62 percent of respondents who cannot work off-site would like to.

The survey also found that workers prized the ability to telecommute more highly than stock options or on-site child care.

Mike Amble, a former senior vice president of operations and engineering for a major American financial company, said telecommuting fits well into the mission of corporate IT departments. The most skilled employees may not live in the immediate area, for one thing, and second, such departments are constantly on-call and therefore "working on a 24-hour clock," he said.

But feelings are different in the corner office, from Amble's perspective. "I think generically, at a higher administrative level, there just isn't a clear understanding of the value of people working remotely," he said. "There's still this historic perception on the part of management in general that [telecommuters are] not working. Conversely, there is a very low percentage of people who take advantage of their companies."

Amble knows this all too well. He recalled taking over a department and subsequently learning one work-at-home employee had two other jobs, a revelation that traveled fast. "That one example went through the organization like wildfire," he said.

Jim Bird, founder and CEO of WorkLifeBalance.com, a training and support company based in Atlanta, said telecommuting requires "the right mindset and the right skill set. Either you can't get work done, or you never stop working and it consumes you.

"Even if you don't have a telecommute program, your people are working at home," Bird added. "They're answering those e-mails at 10 o' clock, they're working on the weekend."

Bird said companies should recognize the business value of off-site employees, and suggested this has already been widely put to the test. "When we're outsourcing technology jobs 10,000 miles away, we certainly should do it 10 minutes down the road," he said.

The study was conducted by the polling company, inc. and sampled 600 adults who are either working now or plan to within three years. The firm conducted additional interviews with 100 small-business owners. The poll has a margin of error of plus or minus 4 percent within the main sample, and 6.8 percent within the small-business sample, according to the organization.

Citrix Online makes a line of hosted software that caters to off-site workers.

Word is that AT&T is starting to pull some of their work-at-home employees back into the office; I'll bet that isn't a popular decision.



Thursday, November 29, 2007

IBM bolsters Data Warehouse Offerings

eWeek is reporting that IBM has bolstered its Data Warehouse offerings; I hope to get a chance to play with them soon. Here is their report, eWeek - IBM Turns Attention to Data Warehouse:

IBM has launched a suite of integrated software to help customers better manage their data warehouses and business intelligence applications.

IBM's new DB2 Warehouse Performance Management Suite—an add-on to its DB2 database management system—makes it easier for companies to maximize their underlying system resources and takes advantage of the recently announced extreme workload management features in DB2 Warehouse 9.5, IBM officials said. With the software, IBM takes an end-to-end monitoring approach and offers insight on the entire data life cycle, from database and system configuration to data movement processes, they added.

"The new Performance Management Suite will further enhance our dynamic warehousing initiative, providing our clients with the industry's most comprehensive tools for maximizing the performance of their data warehouse," said Arvind Krishna, vice president of IBM Data Services, in a statement. "These new capabilities will deliver unprecedented insight for IBM customers to better manage their BI and data warehousing applications and infrastructure."

IBM made a major move in the area of BI technology when it announced plans this month to acquire BI vendor Cognos for $5 billion.

In addition to the software suite, IBM also announced it is partnering with Appfluent Technology. The companies are combining the query monitoring technology and workload analysis capabilities of Appfluent's software with the monitoring, deep analytics and optimization features of DB2 Warehouse to offer customers the insight they need to better manage the life cycle of user activity, applications, systems and processes.

Read more here about IBM's acquisition of Cognos.



Tuesday, November 27, 2007

10 Top IT Disasters by ZDNet, Plus One of My Own.

Sometimes IT disasters aren't all bad. In fact, what I consider possibly the greatest IT disaster of all time didn't make ZDNet UK's Top 10 list, maybe because without it they wouldn't exist. What IT disaster could have changed the course of a country, if not the world?

I nominate the undetected cracking of the German and Japanese codes by the Allies in World War 2. While it may not have been a disaster for the Allies, it was most definitely a disaster for the Axis powers. And since it involved the first real-time application of a modern computer, it definitely involved IT, and it was probably a greater blow to the Axis than the Atomic Bomb.

For the other 10, read ZDNet UK - The top 10 IT disasters of all time:

From faulty satellites nearly causing World War III to the Millennium Bug, poorly executed IT has had a lot to answer for over the years

Following the loss of the personal records of some 25 million child benefit recipients by Her Majesty's Revenue & Customs this month, the UK government will be acutely aware of how quickly mismanagement of technology can lead to serious problems.

While technology wasn't to blame per se in the HMRC data loss, there are plenty of recorded examples where faulty hardware and software have cost the organisations concerned dearly, both financially and in terms of reputation — and resulted in some near misses for the public.

Here's our considered list of some of the worst IT-related disasters and failures. The order is subjective — with number one being the worst — so feel free to comment using the Talkback facility below if you disagree or have suggestions for disasters we may have missed.

1. Faulty Soviet early warning system nearly causes WWIII (1983)
The threat of computers purposefully starting World War III is still the stuff of science fiction, but accidental software glitches have brought us worryingly close in the past. Although there are numerous alleged events of this ilk, the secrecy around military systems makes it hard to sort the urban myths from the real incidents.

However, one example that is well recorded happened back in 1983, and was the direct result of a software bug in the Soviet early warning system. The Russians' system told them that the US had launched five ballistic missiles. However, the duty officer for the system, one Lt Col Stanislav Petrov, claims he had a "...funny feeling in my gut", and reasoned if the US was really attacking they would launch more than five missiles.

The trigger for the near apocalyptic disaster was traced to a fault in software that was supposed to filter out false missile detections caused by satellites picking up sunlight reflections off cloud-tops.

2. The AT&T network collapse (1990)
In 1990, 75 million phone calls across the US went unanswered after a single switch at one of AT&T's 114 switching centres suffered a minor mechanical problem, which shut down the centre. When the centre came back up soon afterwards, it sent a message to other centres, which in turn caused them to trip and shut down and reset.

The culprit turned out to be an error in a single line of code — not hackers, as some claimed at the time — that had been added during a highly complex software upgrade. American Airlines alone estimated this small error cost it 200,000 reservations.

3. The explosion of the Ariane 5 (1996)
In 1996, Europe's newest and unmanned satellite-launching rocket, the Ariane 5, was intentionally blown up just seconds after taking off on its maiden flight from Kourou, French Guiana. The European Space Agency estimated that total development of Ariane 5 cost more than $8bn (£4bn). On board Ariane 5 was a $500m (£240m) set of four scientific satellites created to study how the Earth's magnetic field interacts with Solar Winds.

According to a piece in the New York Times Magazine, the self-destruction was triggered by software trying to stuff "a 64-bit number into a 16-bit space".

"This shutdown occurred 36.7 seconds after launch, when the guidance system's own computer tried to convert one piece of data — the sideways velocity of the rocket — from a 64-bit format to a 16-bit format. The number was too big, and an overflow error resulted. When the guidance system shut down, it passed control to an identical, redundant unit, which was there to provide backup in case of just such a failure. But the second unit had failed in the identical manner a few milliseconds before. And why not? It was running the same software," the article stated.

4. Airbus A380 suffers from incompatible software issues (2006)
The Airbus issue of 2006 highlighted a problem many companies can have with software: what happens when one program doesn't talk to another. In this case, the problem was caused by two halves of the same program, the CATIA software that is used to design and assemble one of the world's largest aircraft, the Airbus A380.

This was a major European undertaking and, according to Business Week, the problem arose with communications between two organisations in the group: the French Dassault Aviation and a Hamburg factory.

Put simply, the German system used an out-of-date version of CATIA and the French system used the latest version. So when Airbus was bringing together two halves of the aircraft, the different software meant that the wiring on one did not match the wiring in the other. The cables could not meet up without being changed.

The problem was eventually fixed, but only at a cost that nobody seems to want to put an absolute figure on. But all agreed it cost a lot, and put the project back a year or more.

5. Mars Climate Observer metric problem (1998)
Two spacecraft, the Mars Climate Orbiter and the Mars Polar Lander, were part of a space programme that, in 1998, was supposed to study the Martian weather, climate, and water and carbon dioxide content of the atmosphere. But a problem occurred when a navigation error caused the lander to fly too low in the atmosphere and it was destroyed.

What caused the error? A sub-contractor on the Nasa programme had used imperial units (as used in the US), rather than the Nasa-specified metric units (as used in Europe).
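A tiny sketch of that kind of interface mismatch (hypothetical numbers, not NASA's actual code): one side emits thruster impulse in pound-force seconds, the other assumes newton-seconds, and every downstream figure is silently off by a factor of roughly 4.45 unless the conversion happens explicitly at the boundary.

```python
# Hypothetical producer/consumer pair for illustration: the producer reports impulse
# in pound-force seconds while the consumer assumes newton-seconds.

LBF_S_TO_N_S = 4.44822  # one pound-force second expressed in newton-seconds

def reported_impulse_lbf_s() -> float:
    return 100.0  # what the ground software hands over (imperial units)

def course_correction(impulse_n_s: float) -> float:
    return impulse_n_s * 0.01  # toy stand-in for the navigation calculation (expects SI)

wrong = course_correction(reported_impulse_lbf_s())                  # units never converted
right = course_correction(reported_impulse_lbf_s() * LBF_S_TO_N_S)   # converted at the interface
print(wrong, right)  # 1.0 vs ~4.45: the same "number", two very different answers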

6. EDS and the Child Support Agency (2004)
Business services giant EDS waded in with this spectacular disaster, which assisted in the destruction of the Child Support Agency (CSA) and cost the taxpayer over a billion pounds.

EDS's CS2 computer system somehow managed to overpay 1.9 million people and underpay around 700,000, partly because the Department for Work and Pensions (DWP) decided to reform the CSA at the same time as bringing in CS2.

Edward Leigh, chairman of the Public Accounts Committee, was outraged when the National Audit Office subsequently picked through the wreckage: "Ignoring ample warnings, the DWP, the CSA and IT contractor EDS introduced a large, complex IT system at the same time as restructuring the agency. The new system was brought in and, as night follows day, stumbled and now has enormous operational difficulties."

7. The two-digit year-2000 problem (1999/2000)
A lot of IT vendors and contractors did very well out of the billions spent to avoid what many feared would be the disaster related to the Millennium Bug. Rumours of astronomical contract rates and retainers abounded.

And the sound of clocks striking midnight in time zones around the world was followed by... not panic, not crashing computer systems, in fact nothing more than new year celebrations.

So why include it here? That the predictions of doom came to naught is irrelevant, as we're not talking about the disaster that was averted, but the original disastrous decision to use and keep using for longer than was either necessary or prudent double digits for the date field in computer programs. A report by the House of Commons Library pegged the cost of fixing the bug at £400bn. And that is why the Millennium Bug deserves a place in the top 10.
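For a quick illustration of why the two-digit field was a disaster waiting to happen, along with the "windowing" workaround many remediation teams used (illustrative code only, not any specific system's fix), consider this sketch:

```python
# With only two digits stored, "00" comes after "99" in the real world but before it
# in the arithmetic, so interval calculations run backwards across the century boundary.

def years_between(start_year: int, end_year: int) -> int:
    return end_year - start_year

print(years_between(99, 0))   # -99, for an interval that is really one year

# One common remediation was "windowing": pick a pivot and expand two-digit years
# into four digits before doing any date math.
def expand_year(yy: int, pivot: int = 50) -> int:
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99), expand_year(0))                 # 1999 2000
print(years_between(expand_year(99), expand_year(0)))  # 1, as expected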

8. When the laptops exploded (2006)
It all began simply, but certainly not quietly, when a laptop manufactured by Dell burst into flames at a trade show in Japan. There had been rumours of laptops catching fire, but the difference here was that the Dell laptop managed to do it in the full glare of publicity and video captured it in full colour.

(Unfortunately, the video capturing the incident appears to have vanished from the web. If you happen to own a copy, please send it to us as it should make interesting viewing again.)

"We have captured the notebook and have begun investigating the event," Dell spokeswoman Anne Camden reported at the time, and investigate Dell did. At the end of these investigations the problem was traced to an issue with the battery/power supply on the individual laptop that had overheated and caught fire.

It was an expensive issue for Dell to sort out. As a result of its investigation Dell decided that it would be prudent to recall and replace 4.1m laptop batteries.

Company chief executive Michael Dell eventually laid the blame for the faulty batteries with the manufacturer of the battery cells — Sony. But that wasn't the end of it. Apple reported issues for iPods and MacBooks, and many PC suppliers reported the same. Matsushita alone has had to recall around 54 million devices. Sony estimated at the time that the overall cost of supporting the recall programmes of Apple and Dell would amount to between ¥20bn (£90m) and ¥30bn.

9. Siemens and the passport system (1999)
It was the summer of 1999, and half a million British citizens were less than happy to discover that their new passports couldn't be issued on time because the Passport Agency had brought in a new Siemens computer system without sufficiently testing it and training staff first.

Hundreds of people missed their holidays and the Home Office had to pay millions in compensation, staff overtime and umbrellas for the poor people queuing in the rain for passports. But why such an unexpectedly huge demand for passports? The law had recently changed to demand, for the first time, that all children under 16 had to get one if they were travelling abroad.

Tory MP Anne Widdecombe summed it up well while berating the then home secretary, Jack Straw, over the fiasco: "Common sense should have told him that to change the law on child passports at the same time as introducing a new computer system into the agency was storing up trouble for the future."

10. LA Airport flights grounded (2007)
Some 17,000 planes were grounded at Los Angeles International Airport earlier this year because of a software problem. The problem that hit systems at the United States Customs and Border Protection (USCBP) agency was a simple one, caused by a piece of lowly, inexpensive equipment.

The device in question was a network card that, instead of shutting down as perhaps it should have done, persisted in sending the incorrect data out across the network. The data then cascaded out until it hit the entire network at the USCBP and brought it to a standstill. Nobody could be authorised to leave or enter the US through the airport for eight hours. Passengers were not impressed.

For anyone who wants to know, number 2 on the list was actually caused by AT&T laying off experienced techs and leaving the testing and upgrading of the switch code to less experienced techs to save money. Can anyone say brain drain?


11 Things Holding Back Tech

Remember when they broke up AT&T? The reason the judge gave was that the lack of competition was stifling innovation. So maybe becoming too dependent on one company or technology is bad for innovation. What other factors could be holding back innovation in Information Technology? ZDNet UK did a great job of putting together a list, Ten things holding back tech:

Ever get the feeling that we aren't quite yet where we want to be? Here are 10 factors that may be holding back the world's technological development

The pace of change in IT has never been faster — or has it? After 25 years of desktop computing and 15 years of the commercial internet, there are still plenty of frustrations, pains and throwbacks in our everyday technology experience. It's great having a terabyte hard disk, but not so great trying to manage it using interfaces and tools that have barely changed from the days when 40MB was respectable.

Many factors are holding back technology. Here is a list of 10 such barriers, in no particular order. We have almost certainly missed a few, so feel free to leave your comments using the Talkback facility at the bottom of the page.

1. Microsoft's stranglehold on the desktop
Windows unified the personal-computer market, and led it into the enterprise. A good thing, surely? Yes — if unity is more important than innovation, flexibility and a free market. The European Commission disagreed with that, as have courts around the world.

For most people, computing means Windows, not because they choose it but because the company's immense power in retail and business channels, together with the inertia that comes through decades of market dominance, make it a default that's hard to change.

So why does this hold back innovation? The European Commission ruled that computer users are unnecessarily used to products like Windows Media Player — applications that are mediocre just because Microsoft has no real incentive to make them better. Monopolies are anti-competitive and therefore anti-innovation. Just look at Internet Explorer's long stagnation.

Microsoft's stifling influence on new ways of thinking goes beyond applications, however. As Vista so readily proves, rehashing the same idea again and again does not make for progress. For everyone's sake, especially Microsoft itself, the company needs to learn to compete fairly again.

2. Operator lock-in
In Europe, we have only recently emerged from the dark ages of the mobile internet, as the market has forced operators to abandon the so-called "walled garden" approach. This meant that users could only access websites that had been pre-selected by their operator — the very embodiment of what net-neutrality advocates are seeking to block in the US. Of course, that debate revolves around fixed access, and is so relevant in the US because — unlike the UK — most of that country has very little choice of internet provider.

However, both situations show, or have shown, the harm that can be done to innovation when those operating the pipes of the internet decide they want control over content. Operators providing content is nothing new, nor is it a bad or surprising thing for them to do, but that provision needs to be in line with the founding principles of the internet if innovation is to flourish.

Any threat to the equality of access and provision on the internet is a bad thing for innovation, and a combination of the market and regulation is needed to hold such threats at bay.

3. Input methods
We haven't come far. Qwerty is 130 years old, and windows, icons, mice and pointers are 35. Both come from before the age of portable computing. So why are we reliant on these tired old methods for all our new form factors?

There are lots of new ideas — voice, gesture and handwriting recognition; video and infrared inputs that watch what we do with our hands and decide what it is that we want — but the mobile experience remains one of thumb-mangling, eye-straining frustration. A BlackBerry keyboard is a wonder of miniaturisation; shame the same's not true of most BlackBerry users.

Until we manage to break down the barriers erected between us and the machines back in the days before eight-bit processors, we'll be stuck back there too.

4. Battery life
All the newfangled input and display technology in the world doesn't amount to much when your handset and laptop struggle to support more than a few hours' hard usage.

Particularly on the handset side, the increase in processing power needed to support the internet and the mobile office puts huge demands on a device's battery, as do high-speed wireless data technologies like 3G — there is a good reason why the iPhone, which has to provide a reasonable simulation of the iPod's battery life, does not currently use 3G. Also, even when they refrain from exploding, the lithium-ion (li-ion) batteries used in a wide variety of electronic devices become less efficient over time. That means mobile technology will forever lag behind fixed technology.

But perhaps the greatest application for improved battery technology would be in electric cars. The concept is proven and on the street but, until it becomes possible to go as far on a charge as you would on a tank of fuel, only first adopters and urban eco-warriors will bother.

5. The mania for speed
Faster processors are great. However, there is more to computing than processor speed — a point which can be easily proven by comparing a two-year-old PC running Linux with a new PC buckling under the weight of Vista. Shrinking the manufacturing process to enable greater speed has proven essential, but it's running out of magic.

Too much R&D time and money goes into processor speed when other issues remain under-addressed. For example, could data not be handled a bit better? What about smarter ways of tagging data? The semantic web initiative runs along these sorts of lines, so where is the hardware-based equivalent?

It is all very well to be able to run the latest DX10 games on your PC, but untold mould-shattering developments lie on the other side of a concerted effort to rethink the nature of the computer. Whichever chipmaker becomes the first to think beyond speed alone will gain a whole new advantage over its competitors: smarter, not faster, will lead to both smarter and faster.

6. Intellectual property law
John Tehranian, a University of Utah law professor, has worked out that someone doing a job like his could, under US law, be committing more than 80 infringements of copyright a day — even without any P2P file-sharing shenanigans — and end up with multi-billion-dollar fines every year. Even whistling a tune in public is a multi-thousand-dollar mistake.

Intellectual property law is broken. Creativity needs protection, but the current system isn't working. Designed to encourage inventiveness and the building of ideas on ideas, it instead rewards power and influence with more power and influence. The ideal world of the intellectual property lawyer is one where nothing can move without permission; no idea can happen unless it is approved.

This is no model for a world where ideas can spread like never before and information is freer than even the most utopian could have imagined 50 years ago. A new way of thinking about information ownership is needed, and quickly.

7. Skills inequalities
Applications and technology might become more intuitive and creative if more women were involved in the industry. Diversity breeds innovation.

Technology has traditionally been terrible at attracting anyone but the technically minded. Seen by many as incredibly dull and exclusive, the industry most needs the influence of those who give it the least thought. Even the best technical process could benefit from a little humanity.

Industry is also waking up to the developing world and beginning to hear its voice. Technology has the capability of leapfrogging the biggest problems, but only if it's built to match the needs of the people it serves.

The more IT listens to and gives power to those it has traditionally excluded, the better it will be suited to solve real problems for us all.

8. Web 2.0
Speaking of daft innovations that do little to better the lives of humanity, Web 2.0 has a lot to answer for. So the web's gone two-way. Great. But the extremes of enthusiasm shown by financiers and business people are verging on counterproductive.

Do we really need applications like Twitter? What price a poke on Facebook? Microsoft's recent purchase of a chunk of Facebook valued the social-networking company at $15bn (£7.2bn). This is a company that does not yet have a proven business plan, despite having big aspirations as a marketing hub. Two years ago, eBay bought Skype for $2.6bn and Skype — a mostly free service — is currently struggling to justify that price.

It's nice to see the vanguard cashing in. But they're not really worth their valuations or the mountains of cash they have received from venture capitalists, whose money could probably find better use in other areas of technological innovation.

With the global economy in its current, credit-crunched state, Web 2.0 runs the risk of not only taking funding away from worthier areas of research but also contributing to a downturn that may hit the tech industry particularly hard. It remains a crucial element of the way we interact through technology, but its business models need a lot of work.

9. National interests
Every country places a high value — often the highest of values — on the rule of law. So why do they insist on behaving towards each other in a state of virtual anarchy?

If we view technology as a globally collaborative effort, one of the clearest barriers to its development is that of national interests. Look at the interminable arguments in organisations like the International Telecommunication Union. Countries defend the interests of their indigenous corporations and lobby groups; the idea that these interests may be better served in the long term by ceding ground in the short is as popular as skinny-dipping in the Antarctic.

Sometimes it is hard to escape the notion that certain countries are deviating from the pack just for the sake of it, much as Napoleon and the US had horses and carriages use the right-hand side of the road for no other reason than the British used the left.

Despite the upcoming Olympics, China is still dragging its heels over the deployment of 3G because it wants to use its own home-grown standard, TD-SCDMA. Its motivations for this include avoiding payments to western patent holders, but the main driver is the fact that China has a large enough internal market to not have to worry about inconsistencies with international norms. Overall, progress is yet again slowed down.

Some national interests have an almost absurdly negative effect on international technological development. For years, the US government classified encryption technology as a munition, and had export laws that forbade the distribution to the world of chips using the RSA algorithm. The ban proved unworkable in the long term but, for a long while, it seriously held back the development of security technology around the world.

10. The current lack of global wars and/or disasters
Forget peace, love and understanding. For a real boost, technology needs war. World War II gave us radar, rockets, the jet engine and digital computing. It also gave us 50 million dead.

These days, warfare still results in misery and death, but the technological benefits are harder to appreciate. There's not much in a stealth fighter or bomb-disposal robot that helps away from the battlefield.

Let's stick to metaphorical warfare. That's something politicians are good at promoting, but bad at executing — the "war on drugs" and the "war on terror" both sound good but have generated little of note, beyond copious government expenditure on ever more inventive ways to annoy their own citizens.

If we must have war, we might as well use it wisely. The biggest threats to mankind are environmental change, disease and international political and economic upheaval. Putting the nations of the world on a war footing against this terrible triad would produce a flowering of new, focused thinking and technologies — and nobody would get hurt.

I can think of at least one more: the lack of continuity in application development teams. It takes developers several months just to become productive, so the way people trade jobs, moving from one position to another, cannot help but have a detrimental effect on the productivity of any development team, far more than is gained by the "fresh blood" added through all that inter-company movement. Employers need to take seriously the effect of brain drain on the productivity of their tech teams, and by extension their companies' bottom lines.


Software Testing is Changing as Service-Oriented Architecture Gains Popularity

I recently ran across an interesting post on how the world of QA engineering and testing is changing as application development becomes more and more Web/service based. The expertise demands placed on testing organizations have increased markedly. Testers need to know more about a wide array of complex technologies, from performance to security, and these increasing demands show no indication of abating anytime soon. Today we have just started testing widgets on web pages; tomorrow we'll be testing widgets and how they interact with each other.

Nothing is constant in Information Technology except change, and testing is no exception. Check out the article from Search Software Quality - SOA prompts changes in quality assurance:

The times they are a-changin' for QA engineers and testers -- or at least they should be if their organizations are developing service-oriented applications.

According to a recent Aberdeen Group report, significant change is afoot in quality assurance, with the focus turning to quality throughout the end-to-end business transaction. On top of that QA needs to add to its responsibilities of unit and functional testing and take into account integration, regression and business process testing, as well as performance and security testing. In addition, service-oriented architecture (SOA), composite applications and Web services require QA to be a continuous process, and testers need to know more about both the technology and the business requirements.

"Folks taking a transaction view -- how the services are interacting across the enterprise -- those guys are by far outperforming everyone else in terms of quality," said Perry Donham, director of enterprise applications research at Boston-based Aberdeen Group Inc. and author of the report "SOA and Web Services Testing: How Different Can It Be?"

Along with a transaction view, testing organizations must also consider the business view, "a theme running through my research," Donham said. As the emphasis has shifted from IT driving business to business driving IT, IT is turning to SOA and Web services to provide the flexibility to respond to those business requests, he said.

"So you can't treat a service or set of services as self-contained; they interact across the business, and you need to understand the business view of the transaction," Donham said. "Functionally it may work and the unit test works, but from the business process it doesn't work. A lot of folks are missing that. At the end of the day, you want to solve some business problems."

According to the Aberdeen report, which surveyed 240 end users, tracking business requirements is a key component of the new way for building software. Eighty-one percent of what Aberdeen categorizes as "best-in-class" companies among those surveyed use automation to manage those requirements across the lifecycle.

Testing gets more complex
SOA, composite applications and Web services also increase the testing complexity. In the past when you'd built an application and tested it, you knew what the inputs were and what the outputs should be. It was fairly simple to do a QA test, and the interaction among the applications was pretty straightforward, Donham said.

"Now, with an individual service you can unit test, but once you put it into the soup of services interacting, defining an application [isn't] clear," he said. "Instead of looking at the services individually and unit and functionality testing, you need a transactional view. That may mean testing the new service, but it also means integration and regression testing. We've always done integration testing, but on a traditional application and how it behaves interacting with other applications. Now it's not only how it interacts with other services, but with other versions of those services."

Validating how all the components of a service-oriented application work, both individually and together, "is the single biggest sea change" for QA, said John Michelsen, chief scientist and co-founder of iTKO Inc., a Dallas-based developer of SOA testing and validation solutions and a sponsor of the Aberdeen study.

Michelsen also said that instead of one large team delivering a solution annually, there are now dozens of small teams collaborating. And because there is no one project owner, the components are iterating on their own lifecycles, which impacts the behavior of the application.

"You may use my services, but then I'm changing the behavior of your application because I'm changing my services," he said. "Testing is now a continuous activity that has to be validated. Continuous validation of tests is one of the main changes with SOA."

Top challenges for SOA testing
The Aberdeen survey found that the top challenges related to SOA and services testing, on average, included incomplete technical specifications, 41%; QA tools not designed for SOA or Web services, 33%; and not enough time allotted for testing, 42%.

Additional pressure for QA testers, Donham said, is that they don't have domain knowledge.

"The traditional QA role is often button pushing -- they're given a set of specs and they test to those specs," he said. "What you really need for quality is to have domain knowledge applied to QA. In many cases, that means the business analyst sitting with the QA tester, providing inputs into the test case."

According to Michelsen, the issues of software quality that have plagued the industry do not change with SOA.

"Software is still the weakest link in technology," Michelson said. "If we built iPhones the way we built software, this thing would never have released. For some reason, we tolerate significant failures in software despite that fact that we know how to do it on the hardware side. There is still plenty of opportunity to get better at this."

Indeed, Aberdeen found that only 7% of respondents have completely redesigned QA; most (65%) are changing their processes incrementally, on a project-by-project basis. And 60% of the best in class have trained key developers, business analysts and testers in SOA and Web services technology, vs. just 35% of the industry average.

Also, on average, Aberdeen said respondents are performing compliance, orchestration and security testing at "alarmingly low rates." Commented Donham, "We're just not there yet."

Getting around the SOA testing curve
There are steps organizations can take to move themselves further along the SOA testing curve. One is the adoption of automated testing tools to help with the complexity and layers of testing, according to Aberdeen.

"Testing and validation is the single biggest way to reduce production outages," Michelson said. "But the big challenge is can we automate enough of the testing to deliver to production with minimal outages or issues?"

The use of automated SOA and Web services testing tools is already differentiating the best-in-class companies, according to Aberdeen. Fifty-seven percent of best in class use them compared with 29% of what Aberdeen deems as "laggards."

The best in class are also implementing process change at the organizational level, incorporating business users at all phases of the lifecycle and taking a lifecycle view of quality. The survey found that 70% of best-in-class firms measure quality throughout the project lifecycle, not just in testing.


Here is a Secret: Software Production Management Software Pays For Itself

I've personally used Software Production Management software, and I do believe it can boost productivity.

Typically, as part of a CARS testing effort we use Compuware's Production Management software to automate not only the software builds but the testing process itself.

Whether you are doing a software build or implementing a testing methodology, exact repetition of an established process is key. This is where Production Management software shines: it ensures that whatever effort you are working on is done in the same manner as expected, and it tracks the completion of the process, step by step.

When exceptions to the rules occur, planned steps are taken to handle them. When milestones are reached in the process, reports and emails are issued automatically.
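To give a feel for what that looks like in practice, here is a toy sketch of such a controlled process (my own illustration with placeholder commands, not Compuware's product): the same steps run in the same order every time, each completion triggers a notification, and an exception in any step takes a planned path instead of an ad hoc scramble.

```python
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

# Placeholder steps standing in for real checkout, compile, test, and package commands.
STEPS = [
    ("checkout",  [sys.executable, "-c", "print('checking out sources')"]),
    ("build",     [sys.executable, "-c", "print('compiling')"]),
    ("unit-test", [sys.executable, "-c", "print('running unit tests')"]),
    ("package",   [sys.executable, "-c", "print('packaging artifacts')"]),
]

def notify(milestone: str) -> None:
    # Stand-in for the report or email issued automatically at each milestone.
    logging.info("MILESTONE: %s", milestone)

def run_pipeline() -> bool:
    for name, cmd in STEPS:
        try:
            subprocess.run(cmd, check=True)
        except subprocess.CalledProcessError as exc:
            # Planned exception handling: record it, alert, and stop the run cleanly.
            logging.error("Step %s failed (%s); taking the planned exception path", name, exc)
            notify(f"{name} FAILED")
            return False
        notify(f"{name} complete")
    return True

if __name__ == "__main__":
    run_pipeline()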

But you don't have to take my word for it; InfoWorld has the story, Software production management could boost productivity:

Organizations could save as much as $2 million and increase productivity by as much as 15 times through use of "software production management" tools, according to an industry analyst who has just published a report on the issue.

Software production management emphasizes the acceleration, analysis, and automation of the software build, test, and deployment process, said Theresa Lanowitz, founder of voke, which released a report this month entitled, "Market Snapshot: Software Production Management."

The report was based on a survey of Fortune 500 companies, enterprise IT organizations, and ISVs. Lanowitz would not say who funded the study but said questions used were not product- or brand-specific.

Software production management is becoming important as applications get more complex and development houses are worried about delivering high-quality software, Lanowitz said. Complexity is building because of new requirements and more demands.

"With respect to software production management, [organizations are] really concerned about being able to more effectively, more efficiently manage the build [process]," Lanowitz. This process must be integrated with other parts of the application lifecycle, such as testing, she said.

To boost the build process, builds should be done on a more frequent basis to weed out defects earlier, Lanowitz said. Voke also is recommending use of software production management tools, such as IBM Rational Build Forge, Electric Cloud ElectricCommander, Borland Gauntlet, or open-source products like CruiseControl.

"Once [organizations] started using the software production management tools, what they found was that their broken builds disappeared," she said.

Through software production management, savings of as much as $1 million to $2 million were cited by 40 percent of survey participants. Savings occurred through decreased wait time for builds, faster time to market, and eliminating the development and maintenance of internally built tools for software production management.

Productivity increases of as much as 10 to 15 times were noted by 33 percent of respondents.

Electric Cloud's ElectricAccelerator product, for example, has helped Symantec reduce the time it takes to get a software build over to quality assurance personnel, said Maria Morris, senior director of software quality assurance at Symantec.

"It's really important for us," because Symantec must produce as many builds as it can to get critical fixes to quality assurance staff as often as possible, she said.

While software production management plays into the theme of agile development, which involves short, regular releases of software during development process, it is not limited to agile.

The report also covers the concept of "Application Lifecycle 2.0." This concept is about organizations being able to communicate and collaborate better with partners and using processes, technology and application lifecycle technologies such as virtualization, software production management, and source code analysis. With virtualization, software development houses use virtual lab automation, in which a production environment is set up. The environment is at a neutral state for testing.

If you haven't tried Production Management software for yourself, I suggest you give it a try. I think one of its hidden advantages is that it establishes accountability among the process players. You won't have a process stalled because it's waiting for approval by a manager once you have Production Management in place, because the time from when a task is received to when it is acted on and completed is tracked and on display for all to see. You will be surprised how fast it will improve your performance and pay for itself.


Saturday, November 24, 2007

China Plans Cyberwarfare Says Congressional Report

No longer content to just meddle with our election system via illegal campaign contributions, the Chinese are undertaking a program to disrupt our communications networks; in September they actually attempted to disrupt Defense Department computers. That triggered an immediate investigation, and Congress has just received a report that explores China's attempts to disrupt networks across the U.S.

Check out the article from Redstate: China's Plans to Disrupt American Communications Networks:

You may or may not have been paying attention early last September when several unusual disruptions rolled through Defense Department computer networks with links to the Internet. News reports at the time blamed the disruptions on a deliberate "cyberwarfare" attack by China's People's Liberation Army.

According to this story, Congress has just received a report that explores China's drive to acquire the capability to disrupt computer networks in the United States, most likely as part of fighting a war against us.


Needless to say, back in September when the accusations surfaced in the world press, China scurried to deny the story and to accuse the US of thinking like there was still a Cold War on. From the Financial Times story linked above:

China yesterday strongly denied reports that its military was behind a hacking attack on the Pentagon computer network this year.

Jiang Yu, a foreign ministry spokeswoman, said at a regular news briefing that the accusations against China were "absurd".

"The criticism is un-founded, which represents cold war thinking," she said.

I'll let you evaluate the credibility of these statements for yourself.

Now the Defense Department runs a lot of networks, and in fact networked communications are a core element of our approach to war-fighting. But while DoD's public networks (the ones with .mil top-level domains) are linked to the open Internet and thus are vulnerable to hacking activities, the Pentagon also has extensive networks that are totally private. Those are not the subject of this discussion.

The report just submitted to Congress is from the US-China Economic Review Commission, and quotes the commander of the US Strategic Command. From the eWeek article:

If the United States and China were to find themselves in an armed conflict, China is likely to launch cyber attacks on American regional bases in Japan and South Korea, and might even include cyber attacks on the U.S. homeland that target financial, economic, energy and communications infrastructures.

According to Gen. James Cartwright, commander of the U.S. Strategic Command, China is already actively engaging in cyber-reconnaissance through the probing of computer networks of U.S. government agencies and private companies.

Cartwright said the data collected from these reconnaissance probes can be used for many purposes, including identifying network weak points, understanding how U.S. leaders think, discovering the communication patterns of government agencies and private companies, and gaining valuable information stored throughout the networks.

The latter point is a key one. You can get a lot of insights about an enemy simply by performing "traffic analysis" and generally observing his information flow. And these insights can provide a valuable edge in war.

So what if things heat up with China, let's say in a tiff over (likely) protectionist legislation or (less likely) Taiwan? Let's assume that the Chinese have discovered ways to degrade or even disable our communications infrastructure. This of course will have an impact on our whole national life, not just our military capabilities.

What impact? Just as with strategic bombing, that's not a question with a clear answer. According to General Cartwright:

"I think that we should start to consider that regret factors associated with a cyber-attack could, in fact, be in the magnitude of a weapon of mass destruction," Cartwright told the Commission, referring to the psychological effects that would be generated by the sense of disruption and chaos caused by a cyber-attack.

And the counterpoint, by James Lewis of the Center for Strategic and International Studies:

"The effect is usually to solidify resistance, to encourage people to continue the fight, and if you haven't actually badly damaged their abilities to continue to fight, all you've done is annoy them, and what many of us call cyber-attacks [are] not weapons of mass destruction but weapons of mass annoyance," Lewis said.

I don't happen to believe that the Chinese include war against the US as a necessary part of their grand strategy. (Plenty of people in and out of the defense establishment vehemently disagree with me.)

But I think there's not the slightest question that the Chinese are preparing for war against us. There's no other credible enemy that they could possibly face, and if I were them I'd never assume that the US will not strike first.

Additionally, it might be useful even in everyday economic competition for them to degrade our normal network communications, through such capabilities as massive email-spam attacks.

Leaving aside a discussion of the circumstances under which we might go to war against the Chinese, let's recognize that they have little hope of ever matching our logistic and conventional capabilities. Therefore they won't try to do that.

But they will seek "asymmetric" advantages, with which they can blunt our effectiveness both politically and technologically in the event that we attack them. Hence the emphasis on cyberwarfare, and on capabilities such as the satellite killer they messily and noisily tested last January.

I'm not willing to judge China's ability to disrupt our national economy through cyberattacks. Modern communications networks probably have ways of staying resilient and functional that are not designed-in, but arise from their decentralized structure. But they have never been tested under full-scale war conditions, so we don't know.

Wall Street firms were deeply frightened by how fragile and vulnerable their telephone and data networks proved to be on September 11, 2001. By both voluntary and government-mandated action, they moved swiftly to geographically diversify both networks and people operations away from lower Manhattan. The lessons learned then will prove invaluable in recovering from cyberattacks in a war against China.

These are all important questions. But regardless of how we answer them, doesn't it stick in your craw to know that the Chinese are snooping around in your business, and spying on you?

Let's never forget that the Chinese fight dirty, both as economic competitors and as a geopolitical adversary. Perhaps we should be less hesitant to fight dirty against them.


Friday, November 23, 2007

Spam Will Outnumber Legit Email For First Time: Researchers

According to Read/Write Web and USA Today, for the first time spam will outnumber legit email by year's end. But according to my inbox it's been that way for some time; the good news is Google has a really good spam filter. When will these guys learn that spam just makes people mad? The problem is there are still millions of people who are fooled by the scams...

Take a look at Read/Write Web's post Spam Will Outnumber Legit Email For First Time: Researchers:

Researchers at IDC predict that this year the number of spam messages sent will eclipse the amount of legit email correspondence for the first time ever, reports USA Today. Approximately 10.8 trillion spam messages will have crossed through inboxes in the past year, compared to 10.5 trillion legitimate person-to-person email messages. The numbers indicate that spam is a growing problem, despite the promise of better filtering technology.

"Two years from now, spam will be solved," said Bill Gates in 2004 addressing World Economic Forum in Switzerland. But 2007 will go down as the worst year yet for spam, a trend that has held for the past four years, according to Rebecca Steinberg Herson, vice president of marketing at Commtouch, an email security firm.

In an interview last month with USA Today, Gates reminded that though the volume of spam has increased, more of it is being deleted by spam filters. "Sure, there's a lot [of spam] out there, but software is deleting 99.9% of that anyway," he said, though Microsoft has revised that claim to 85% - 95%.

Gates does have a point -- spam filters are getting better. For example, last January, 50% of all spam was image based, but due to better filtering technology, Symantec now says that the percentage of image spam has dropped to just 8%. These quick advances in spam-fighting technologies force spammers to constantly change and update their methods. Eventually, researchers believe that better filters will begin to reduce spam from a colossal pain to a minor annoyance.

"As more people have inboxes protected by better and better spam filters, their experience of spam gets closer to Gates' vision," Richi Jennings, lead analyst at email security at Ferris Research, told USA Today. "He was a bit overaggressive with the prediction, of course. But spam isn't an easy problem to solve."


Massive Data Breach In The United Kingdom by the U.K. Government

Valleywag has brought to my attention yet another massive consumer data breach. This time it isn't a bank or a retailer; this time it is an entire government at fault. I've written before about how, as data is more concentrated and made more available via the web, these breaches become more likely and more massive. How massive this time? Try 40 percent of the entire United Kingdom.

Check out the Valleywag story: British government misplaces data on 40 percent of population:

Don't forget the original reason to be thankful: We're not Brits! Yay! And this week, our gratitude extends beyond straight teeth. The U.K. government has admitted that it lost computer disks containing detailed personal information on some 25 million citizens, or about 40 percent of the population. The disks, still unaccounted for, held bank account numbers, names and national insurance numbers, according to reports. The lost data also included information on almost every child under the age of 16.


Post Thanksgiving Resolution, Take A Walk And Record It

The Future Pundit has and excellent suggestion on this day after the big feast, take a walk and then record it. According Dr Dena M. Bravada of Stanford University, in her study evaluating the association between pedometer use and and physical activity, Pedometers radically increase their users physical activity, particularly when the devices compare performance to goals.

The takeaway: get a pedometer to measure your walking and adopt a goal.

From her study:

Dena M. Bravata, M.D., M.S., of Stanford University, Calif., and colleagues evaluated the association between pedometer use and physical activity and health outcomes among adults. The authors searched databases for studies and articles on this topic, and identified 26 studies with a total of 2,767 participants that met inclusion criteria (eight randomized controlled trials [RCTs] and 18 observational studies). The participants’ average age was 49 years and 85 percent were women. The average intervention duration was 18 weeks.

In the RCTs, pedometer users significantly increased their physical activity by 2,491 steps per day more than control participants. Among the observational studies, pedometer users significantly increased their physical activity by 2,183 steps per day over baseline (2,000 steps is about one mile). Overall, pedometer users increased their physical activity by 26.9 percent over baseline. Among the intervention characteristics, having a step goal was the key predictor of increased physical activity. The three studies that did not include a step goal had no significant improvement in physical activity with pedometer use in contrast to increases of more than 2,000 steps per day with the use of a 10,000-step-per-day goal or other goal.

Intervention participants significantly decreased their body mass index by 0.38 from baseline. This reduction was associated with older age and having a step goal. Participants also significantly decreased their systolic blood pressure by 3.8 mm Hg, which was associated with greater systolic blood pressure at baseline and change in steps per day.

The Future Pundit recommends:

Use technology to give you immediate feedback on progress toward goals. Makes sense.

I'm thinking I ought to get a pedometer. Go for the pocket Omron HJ-720ITC or the corded Omron HJ-112? Does the pocket one work as well?

Sounds good. I think I'll add the Omron HJ-720ITC Pocket Pedometer with Advanced Omron Health Management Software to my Christmas list. The software looks to be an MS .NET application with an Access backend; I'll look over the software to get a better idea of how it works and how it links back to the Pocket Pedometer via a standard PS2 to mini PS2 connector.

I could stand to lose about 50 pounds or so, and if I can plug this into a project, I might just get into it. I've got an idea about an even more effective way of tracking goals and performance; a rough sketch of the idea is below.
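As a minimal sketch of that goal-vs-performance idea (not the Omron software itself): log daily step counts, compare them to a target, and report the kind of over-baseline improvement the Stanford study measured. The step numbers and baseline below are made-up sample data.

# Hypothetical step-goal tracker; DAILY_GOAL is the 10,000-step goal cited in
# the study, BASELINE_STEPS is an assumed pre-pedometer average.
DAILY_GOAL = 10_000
BASELINE_STEPS = 6_000

readings = {                 # hypothetical pedometer downloads, date -> steps
    "2007-11-23": 7_412,
    "2007-11-24": 9_050,
    "2007-11-25": 10_233,
}

for day, steps in sorted(readings.items()):
    pct_of_goal = 100 * steps / DAILY_GOAL
    print(f"{day}: {steps:>6} steps ({pct_of_goal:5.1f}% of goal)")

average = sum(readings.values()) / len(readings)
improvement = 100 * (average - BASELINE_STEPS) / BASELINE_STEPS
print(f"Average {average:.0f} steps/day, {improvement:+.1f}% vs. baseline "
      f"(study participants averaged roughly +27%).")

The key design point, per the study, is the explicit goal: the feedback only changes behavior when there is a target to compare against.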

Sphere: Related Content

Wednesday, November 21, 2007

Nasty Trojan disguised as personal emails addressed to you

I just ran across this and thought I'd put out the word; this sounds like it could become a bad one. From the Register, New emails address you by name, then try to hose your PC:

Beware of emails that mention you and your company by name and claim to be official communications from the US Department of Justice. They're phony and will attempt to install malware on your machine.

The emails, which claim to reference a complaint recently filed by a business associate, invite the recipient to click on an attachment that contains a nasty Trojan, two separate security firms, MessageLabs and Websense, are reporting.

The practice of trying to extract sensitive information by sending highly personalized emails is generally called spear phishing, and it's proved to be successful in the past.

In May, security researchers from SecureWorks reported that emails purporting to come from the Better Business Bureau duped 1,400 business managers into installing a post logger on their machines. The loggers siphoned sensitive information that passed through the victims' Internet Explorer browser, including social security numbers, account numbers, and data that normally would be securely cloaked behind SSL defenses.

Spear phishing emails are notable for their impeccable grammar and spelling, a characteristic that distinguishes them from many of the plain vanilla phishing scams out there. Other recent spear phishing campaigns have masqueraded as emails from the Federal Trade Commission.

According to Websense, none of the major anti-virus companies detect the Trojan included in the fake Justice Department emails. That's likely to change in the next 24 hours, if it hasn't already.

Sphere: Related Content

Wednesday, November 14, 2007

Oracle Announces VM Included Free

It's never good for a software company when another, larger software company decides to give away for free the same functionality as your primary product; just ask the guys over at Mozilla.

So the managers over at VMware must be scratching their heads and polishing their resumes at Oracle's announcement that it will be including a free VM in its upcoming offerings.

Here is part of eWeek's take on the announcement Oracle Free VM Offering Raises Eyebrows:


Oracle is the latest player to jump into the virtualization market, and the news helped cut the price of VMware shares.

But VMware did not hesitate to respond to the announcement Oracle made Nov. 12 at its OpenWorld conference in San Francisco regarding its plans to offer free server virtualization software.

"Oracle's introduction of yet another variant of Xen is clearly a response to the significant virtualization industry that VMware has established," said Parag Patel, vice president of alliances at VMware. "The offering does not address the capabilities required to achieve the cost savings and IT simplification that customers are realizing everyday from VMware's Virtual Infrastructure."

The sniping was in reaction to Oracle VM, which will be available for free download on Nov. 14 and which Oracle officials contend offers scalable, low-cost server virtualization up to three times more efficient than rival offerings. The product, which is based on the open-source Xen hypervisor technology, features both Windows and Linux support and includes an integrated Web browser-based management console.

To speed installation and deployment, Oracle VM utilizes preconfigured Virtual Machine images of Oracle Database and Oracle Enterprise Linux. In addition, it is certified to work on Oracle Database, Oracle Fusion Middleware and Oracle Applications.

"Customers can now optimize resource consolidation by deploying Oracle VM with Oracle Unbreakable Linux and run the full Oracle software stack—Oracle Database, Oracle Fusion Middleware, Oracle Enterprise Manager and Oracle Applications—all with one worldwide support call," said Edward Screven, Oracle's chief corporate architect, in a statement.

"With Oracle VM, customers can respond more rapidly to business changes, increase ROI and reduce lifetime total cost of ownership. Oracle VM brings enterprise-class support and backing to server virtualization, giving customers the confidence to deploy virtualized solutions," he said

VMware needs to take a good look at this new Oracle VM product and see if they can differentiate themselves from it. If not, I'd be looking into making a deal with Oracle.

Sphere: Related Content

New Trend, Outsource Your Security Operations to the Telecoms

I've talked about the coming of Cloud Computing to the industry, but I haven't talked about another trend: outsourcing web-based security to the telecommunications carriers.

It makes sense to centralize and standardize security efforts by outsourcing them to the telecom carriers, as it assures a more dependable, cost-effective security solution, especially today when the majority of corporate data centers are decentralized. But as Cloud Computing takes hold, I'm not sure the model will still hold up; time will tell...

Check out this from Infoworld's article Carrier security outsourcing making connections:

Massive telecommunications carriers, including AT&T, BT, and Verizon, are promoting their ability to take over a significant portion of customers' IT security operations, and some enterprises are already buying into the model.

Over the last several years, the sprawling carriers have acquired and introduced a range of technologies and services that promise to help protect customers from the growing wave of IT-based threats, including denial-of-service attacks and Web-borne malware programs.

And while the carriers all concede that delivering safe telecommunications and Internet access to their customers' doorsteps is a fundamental requirement of their core business, the companies also maintain that they are uniquely positioned to become broader security outsourcing partners for their clients.

Florida-based First Advantage is already outsourcing a significant portion of its IT security operations to its carrier partner, Verizon Business.

The publicly held risk management services company had previously seen a payoff from outsourcing elements of its call center and software development operations, and the increasing complexity of its security and compliance concerns made it a natural to enlist Verizon to take over more of the work, executives said.

"Compared to the cost of internal operation, price was obviously a major consideration, but it was also the idea that these are specialists in security who we would be turning to," said Isabelle Theisen, chief security officer at First Advantage.

"They can provide a level of monitoring and correlation that would not have been possible for us to achieve internally," she said. "And we wanted to take a more leading-edge approach to matching potential threats to specific assets, something that alone would demand a full-time team for us otherwise."

Theisen estimates that First Advantage, which specializes in employee background screening and insurance fraud investigation, has already ceded approximately 70 percent of its IT security operations to Verizon.

Among the services it consumes from the carrier, which include everything from AV (anti-virus) and IPS (intrusion prevention systems) to Web applications firewalls and operation of compliance-mandated server farms, many came to Verizon via its July 2007 acquisition of CyberTrust.

However, the seeds that have grown into the company's portfolio of security outsourcing skills were taking root years before the deal for the MSS (managed security services) company came to pass, Verizon Business officials said.

"We saw tremendous growth in demand for additional security services starting in 2005 and knew we needed to scale up; adding CyberTrust has helped us expand rapidly, and we're seeing even greater demand today, especially among multinational customers," said Cindy Bellefeuille, director of solution and product marketing at Verizon Business.

Some industry watchers have said that customers will increasingly expect carriers to eliminate many security threats as part of their core connectivity services just as they require the companies to guarantee network performance speeds in their SLAs (service level agreements).

However, Bellefeuille said that companies such as Verizon can meet those demands while creating new opportunities for additional services aimed at thwarting targeted threats or providing automation services, such as the Sarbanes-Oxley compliance server operation the company oversees for First Advantage.

"There will always be opportunities from an attack perspective for us to take action and protect customers on the backbone, and we'll do that," she said. "We've also driven a lot of internal innovation in last three years for fighting issues such as targeted attacks; we're building out the honeynets and doing more correlation of data. Now we're in the phase of launching services as both stand-alones and value-adds."

Verizon marketers foresee a future wherein the company could become a provider of end-to-end security outsourcing services.

While a majority of the services it has offered thus far have centered on external issues such as helping its customers ward off DoS threats and malware, the company is already getting its hooks into more internal security operations, such as identity management, filtering data pulled in by IDSes (intrusion detection systems), and providing protection for various types of databases and software applications.

While it makes sense to centralize security at the telecom central office, it also makes sense to centralize computing services in general in a "computing cloud," where not only security and network administration are handled by professionals, but so are capacity planning and database/application administration.

Maybe the recent rumors concerning Google buying Sprint, and AT&T buying Yahoo! aren't so crazy after all.

Sphere: Related Content

Tuesday, November 13, 2007

Porn Used To Break CAPTCHA Security

To all Web page/application designers who believe CAPTCHAs will ensure that your pages remain protected from Web spammers and evildoers, take a look at this article from Read/WriteWeb:

A virtual stripper named "Melissa" that promises to progressively remove items of clothing for viewers who solve online CAPTCHAs is actually part of a scheme by spammers to crack web site registration traps meant to keep them out, reported security researchers this week.

Every time a user correctly enters the text on a CAPTCHA, the user is rewarded by Melissa removing another item of clothing. The catch is that the CAPTCHAs are being fed from real services, like Yahoo! Mail's signup process. So users looking for a free skin show are actually helping spammers and scammers thwart online security measures that usually keep their robots out.

"They're using human beings in semi-real time to translate CAPTCHAs by proxy," Paul Ferguson, a network architect at Trend Micro, told IDG News. "You have to give them this, it's clever." Certainly it is. Human intelligence has often been used in this way to solve problems that computers struggle with. We've previously reported on the reCAPTCHA system (here), which uses the spam-fighting images to digitize books, and on GalaxyZoo (here), which utilizes human input to identify and categorize galaxies.

According to Trend Micro, the Melissa striptease is part of a Trojan called CAPTCHA.a (Symantec calls it Captchar.a) that attacks Windows PCs. This isn't the first time spammers have employed humans to try to crack CAPTCHAs, said Trend Micro. "Work-at-home money mule schemes run by criminals have hired people to do this same thing," Ferguson told IDG. "They're told to log on to this Web page and type the CAPTCHA. They have a quota."

For now the threat appears to be benign -- used only to register free email accounts to flood chat rooms with unsolicited marketing pitches -- but there is a worry among security researchers that the same technique could be used for something more diabolic, like breaking into financial institution web sites.

I must say the evildoers are really getting creative, using porn addicts to assist them in breaking CAPTCHA security. I expect the initial targets to be high-value ones, such as banking and other financial institutions, since I would tend to believe the attackers would have trouble attracting enough porn addicts to do the translations on a mass scale.

But if I were designing Web applications, I would look for a substitute for CAPTCHA security, because I could be wrong.
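As a judgment call on my part rather than anything from the article, here is a minimal sketch of two common CAPTCHA supplements: a hidden "honeypot" form field that real users never fill in, plus per-IP rate limiting on signups. All names (check_signup, MAX_SIGNUPS_PER_HOUR, the decoy field) are hypothetical.

import time
from collections import defaultdict

MAX_SIGNUPS_PER_HOUR = 5
_signup_log = defaultdict(list)   # ip -> list of recent signup timestamps

def check_signup(form: dict, ip: str) -> bool:
    """Return True if the signup looks human, False if it should be rejected."""
    # 1. Honeypot: bots tend to fill every field; humans never see this one
    #    because it is hidden with CSS in the page template.
    if form.get("website_url_confirm"):      # hidden decoy field
        return False

    # 2. Rate limit: a single IP hammering the signup form is suspicious.
    now = time.time()
    recent = [t for t in _signup_log[ip] if now - t < 3600]
    _signup_log[ip] = recent
    if len(recent) >= MAX_SIGNUPS_PER_HOUR:
        return False

    _signup_log[ip].append(now)
    return True

Neither measure is a silver bullet against the human-proxy scheme described above, but the honeypot still catches the automated half of the attack and the rate limit caps how fast any one source can register accounts.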



Sphere: Related Content

H1B Update: I.T. Companies 1, I.T. Workers 0

The editorial team over at SearchSAP.com is reporting that, according to InformationWeek, the Congress critters have eliminated the proposed increase in H1B application fees from the H1B bill working its way through joint committee. The increased application fees were to be used to fund new scholarships, worth up to $15,000 annually, for qualified U.S. computer science students.

Information technology industry groups have opposed the fee increases, saying they would make the H1B program less effective.

By complaining that the increased fees would hurt the effectiveness of the program, they are admitting that the purpose of the H1B program is not to fill critical shortages in specific areas but, in fact, to hold down I.T. industry salaries.

If the purpose of the program were simply to fill critical shortages, they would happily pay the increased fees to get the help they so badly need, especially if those fees were used to increase the talent pool available to them to fill future shortages.

Instead, they just want to increase the number of H1B positions available, artificially lowering I.T. salaries by flooding the market with foreign workers willing to work for below-market wages in order to gain access to life in the U.S.

For far too long, the information technology companies in this country have paid too much attention to their MBA'd managers, who are primarily concerned with keeping salaries artificially low so that they continue to meet falsely construed Wall Street expectations. These same companies, which depend on their I.T. staffs to produce the actual product they market, pay little or no attention to the welfare of their I.T. workers.

I.T. workers across the country have seen their standard of living stagnate if not decrease over the life of the H1B program, because of the flood of foreign workers.

For more on this check out H1B visa update: No increased fees — SAP Watch:

H1B visas have remained a hot button topic for SAP professionals for some time, and we’ve covered it quite a bit over the years. Last we heard, there was legislation put into motion to more than triple the H1B application fee, from $1,500 to $5,000. The influx of money would be used to fund new scholarships for U.S. students to the tune of $15,000 annually for qualified computer science students.

Well, that probably won't happen after all. InformationWeek just reported that the amendment is most likely about to get the boot, which comes as good news to Microsoft, Sun and others who have gone on the record to support easing and increasing H1B visas. They view the H1B visa program as a key component of staying competitive and adequately staffing projects with the best and brightest in the world.

It's time we stand up to these I.T. industry MBAs and their fake-shortage H1B scheme; they are not interested in filling shortages, they are interested in filling their pockets. H1B is little more than a license for I.T. companies to print money for their own use.

Call, fax, and email your congressional representatives about this travesty of a bill, and tell them to vote no on the revised version.

Sphere: Related Content

Monday, November 12, 2007

LA "BotMaster" Caught, Up To 60 Years In Prison Possible

Looks like they caught one: an L.A. man admitted on Friday that he had infected 250,000 computers with "botnet" software, which enabled him to wiretap communications and access their owners' bank accounts.


John Schiefer, 26, will plead guilty to four counts of fraud and wiretap charges, with possible penalties of up to a $1.75 million fine and up to 60 years in prison, per the Los Angeles U.S. Attorney's office.

Check out the Reuters story "Botmaster" admits infecting 250,000 computers:

Prosecutors said Schiefer and an unspecified number of conspirators installed malicious computer codes that acted as a wiretap on compromised computers and intercepted messages to www.paypal.com and similar Web sites.

He retrieved usernames and passwords and used them to access an unknown number of bank accounts. Prosecutors said they were still investigating how much money was stolen and the number of victims.

They said Schiefer worked by day as an information security consultant but was a well-known "Botmaster" among the underground network of hackers skilled in so-called "botnet attacks."

He is expected to be arraigned on December 3.

You can probably guess my take: lock him up and throw away the key! The biggest problem is that the vast majority of these guys get away totally free, or if they are caught they get a slight slap on the wrist, or even a six-figure job in I.T. security.

Sphere: Related Content

Yahoo! Announces Academic "Cloud" Computing Program

Not to be outdone by Google and IBM, Yahoo! has announced a similar plan to give students access to Hadoop, an open source distributed computing framework (i.e., open source Cloud Computing) to which Yahoo! has been a key contributor. The Hadoop data center, named M45 after the Pleiades star cluster, is a 4,000-processor cluster supercomputer with 3 terabytes of memory and 1.5 petabytes of disk space. That's one big Cloud!

Take a look at the Read/WriteWeb article Yahoo! Announces Distributed Computing Academic Program:

Yahoo! claims that the M45 cluster is one of the top 50 fastest supercomputers in the world, capable of performing at 27 Teraflops.

According to Yahoo!, universities have not had access to the type of hardware and software infrastructure necessary for web-scale distributed computing research. Yahoo! intends for Carnegie Mellon to be first school in a broader academic research partnership program. CMU and Yahoo! also plan to hold a Hadoop Summit in the first half of 2008, to which they say they would invite major Hadoop users such as Facebook and the University of California at Berkeley.

"Yahoo! is dedicated to working with leading universities to solve some of the most critical computing challenges facing our industry," said Ron Brachman, vice president and head of Yahoo! academic relations in a press release. "Launching this program and M45 is a significant milestone in creating a global, collaborative research community working to advance the new sciences of the Internet."

First Google and IBM, and now Yahoo! It's getting pretty clear that "Cloud Computing", and with it massive consolidation of corporate data centers, is coming in the not-too-distant future.

So a word of warning to those who have resisted learning web-based programming: continuing to resist will most likely put you in the same boat as the RPG and PL/1 programmers of a decade ago.
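For a taste of what jobs on a cluster like M45 look like, here is a minimal sketch of the classic word-count example, written for Hadoop Streaming so plain scripts can act as the map and reduce steps. The file name and the invocation paths are assumptions for illustration only, not anything specific to Yahoo!'s setup.

# Typical invocation (paths are hypothetical):
#   hadoop jar hadoop-streaming.jar \
#       -mapper wordcount.py -reducer "wordcount.py reduce" \
#       -input /data/docs -output /data/counts

import sys

def mapper():
    # Emit one "word <TAB> 1" line per word read from stdin.
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

def reducer():
    # Hadoop delivers mapper output sorted by key, so equal words are adjacent.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    reducer() if "reduce" in sys.argv[1:] else mapper()

The point of the framework is that the same two small functions run unchanged whether the input is one file on a laptop or a petabyte spread across 4,000 processors.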

Related Post: Google And IBM Pitch In To Improve Computer Science Education

Sphere: Related Content

Saturday, November 10, 2007

Hey Apple, Data loss bug discovered in Leopard! - IBM

Seen the latest Apple commercials with the IBM/Apple guys, the ones where IBM has to admit Vista has had some problems...

Well, it looks like Apple may have spoken a little too soon: a blogger has apparently discovered a data loss glitch in Apple's highly touted Mac OS X 10.5 Leopard. The bug results in the loss of data when moving files to or from any external drive, including external hard drives, USB flash drives, SMB file shares, or even network-attached storage. Ouch!

Excerpt from The Register:

According to Tom Karpik, the bug manifests itself when an attempt to move - rather than copy - a folder from the Mac is interrupted. Moving a folder off a disk essentially involves first copying it then deleting the original. Karpik shows that Finder fails to ensure that the folder has been successfully written to the target disk before removing the original.

Read Tom Karpik's post here: Massive data loss bug in Leopard.

Or check out Tech Republic's: Data loss bug discovered in Leopard

Hey Apple, I guess the saying "What Goes Around Comes Around" applies; it looks like someone besides IBM needs to take another look at their testing procedures.
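Here is a minimal sketch, in Python rather than anything Apple ships, of the verify-before-delete step the Register excerpt says Finder skips: copy the folder, confirm every file arrived intact, and only then remove the source. The paths in the usage comment are placeholders.

import hashlib
import shutil
from pathlib import Path

def _tree_digest(root: Path) -> dict:
    """Map each file's path (relative to root) to a SHA-256 of its contents."""
    digest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest[path.relative_to(root)] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest

def safe_move_tree(src: str, dst: str) -> None:
    """Move a directory by copying, verifying, and only then deleting the original."""
    src_path, dst_path = Path(src), Path(dst)
    shutil.copytree(src_path, dst_path)                   # 1. copy everything
    if _tree_digest(src_path) != _tree_digest(dst_path):  # 2. verify the copy
        raise IOError(f"copy of {src} to {dst} does not match; source kept")
    shutil.rmtree(src_path)                               # 3. delete only after verify

# safe_move_tree("/Volumes/Internal/Photos", "/Volumes/USBDrive/Photos")

Hashing every file is slow for big trees, but the design point is the ordering: if the copy is interrupted or incomplete, the original is never touched.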

Sphere: Related Content

Friday, November 9, 2007

Extinction Of Programmers? Not For A While!

Hans-Eric Gronlund has an interesting post speculating on "The Extinction Of Programmers". While it's an interesting premise, I'm not worried just yet. Hans-Eric quotes another article, The Future of Software Development by Alex Iskold, which predicts a future where only a few high-quality software engineers will be able to serve the world's need for computer systems.

Sounds like the early computer scientist who was quoted as saying all the world's computing could be done on a dozen or so mainframes...

Hans-Eric then proposes a time when a "Super-Compiler" exists that takes in specs and outputs fully functional, fully tested code.

The problem with this is that I frequently find developing adequate specifications for a project to be just as hard as creating and testing the final code. The "Super-Compiler" would just move the effort a step back, and developers would spend their time developing error-free specifications instead of error-free code.

As we have all experienced, users frequently don't even know what they want, especially if they have never had it before.

Now when the "Super-Compilers" develop the ability to read users minds and foretell the future in addition to outputting functional, tested, bug free code, I'll start worrying.

Sphere: Related Content

Oracle Has A Zero Day Bug In 10g, When Will They Fix It?

Looks like the folks over at Oracle have a buffer overflow problem in one of their stored procedures in Oracle database version 10g Release 2, present even on systems with patches as recent as February 2007, but they are not in much of a hurry to fix it.

Discovered by an anonymous researcher and then reported to VeriSign's iDefense Labs, the bug could allow evildoers to load illicit code onto unsuspecting systems; however, Oracle does not plan on patching the procedure until its scheduled January 15 security release.

For more on Oracle's latest bug and their casual attitude toward it, check out The Register: Zero-day bug hangs over Oracle database, Forget about it, and iDefense's report: PUBLIC ADVISORY: 11.07.07

Sphere: Related Content

Wednesday, November 7, 2007

Grassley: H-1B Program Displaces U.S. Workers

United States Senator Chuck Grassley (R-Iowa) continued his battle against the H-1B visa program and its supporters this week, charging that the H-1B program takes jobs away from American workers. Senator Grassley said that he's willing to consider an increase in the annual visa cap if proponents agree to include reforms in the bill.

Intended to be a temporary program to fill jobs that otherwise could not be filled with U.S. workers, the H-1B program has become a favorite of high-tech employers. Instead of serving its intended purpose, the program has allowed tech employers to limit labor costs by depressing I.T. salaries nationwide via an increase in the available supply of technical professionals.

By depressing I.T. salaries, the H-1B program has had an unforeseen effect: depressing the enrollment of students in I.T. training programs at U.S. universities. Qualified students, seeing that I.T. positions no longer carry the prestige, high demand, and salaries they once commanded, have chosen more prestigious, higher-paying, or less demanding majors.

This unforeseen effect of declining I.T. enrollment has further depressed the supply of I.T. professionals flowing from U.S. schools to the industry, so once again high-tech employers want to increase the number of available workers by substantially upping the number of H-1B positions.

The changes that Senator Grassley wants to implement are directed at putting the program back on track: employers could still go out and get the labor they need, but in order to do so they would have to pay a price in much higher application fees. The government would then use these increased fees to encourage U.S. students to go into the industry through loans and grants to promising students.

But the high-tech industry says these increased fees will keep them from using the program, and therefore are impractical and must not be included in the final bill.

The proposed changes would discourage those companies looking to use the H-1B program to control their labor costs, rather than to actually supply needed workers, from using the program.

But those companies that really need the technical help will be able to get it, and in doing so will encourage promising students at U.S. universities to fill those same positions in the future...

Check out what Senator Grassley had to say in Infoworld's Grassley: H-1B visas being used to displace U.S. workers:

The H-1B program was "intended to fill jobs for a temporary amount of time while the country invested in American workers to pick up the skills they needed," Grassley said in the Senate on Monday, according to a transcript of his remarks. "Unfortunately, the H-1B program is so popular that it's now replacing the U.S. labor force."

Grassley, who has been a vocal H-1B critic, also cited a string of "bad apple" practices associated with the visa program. "One of my constituents has shared copies of e-mails showing how he's often bombarded with requests by companies who want to lease their H-1B workers to him," the senator said, referring to visa holders who are awaiting a work assignment -- or on the bench, in industry parlance.

"Another constituent sent me a letter saying that he saw firsthand how foreign workers were brought in while Iowans with similar qualifications were let go," Grassley said. "He tells me that he is a computer professional with more than 20 years of experience. He was laid off and has yet to find a job."

Grassley and Sen. Dick Durbin (D-Ill.) pushed for H-1B reforms in legislation that was introduced last spring and then folded into a comprehensive immigration reform bill. However, the latter measure failed to win approval in the Senate.

They also have been gathering data on H-1B usage, including a finding that nine IT offshoring companies based overseas, most of them in India, used about 20,000 visas during the federal government's 2006 fiscal year.

Among the H-1B reforms sought by Grassley and Durbin are rules requiring employers to attest that they aren't displacing U.S. workers by hiring H-1B visa holders and that they have taken "good-faith steps" to hire American citizens. The reform proposals put forward by the two senators would also give the U.S. Department of Labor more investigative powers and hike penalties for noncompliance, among other changes.

In addition, Grassley is seeking an increase in the H-1B application fee from $1,500 to $5,000. The U.S. uses the visa fees to fund scholarships for science, engineering and technology students.

"I'm willing to consider an increase in H-1B visa supply, but only if reforms are included," he said this week.

The immigration reform bill that was quashed by the Senate in June would have increased the current annual cap of 85,000 visas, which includes 20,000 that are set aside for foreign workers with advanced degrees from U.S. universities, to an all-inclusive limit of 115,000. Since the bill was rejected, there have been ongoing efforts to raise the H-1B cap in other pieces of legislation.

The push to increase the cap is also coming from groups outside of the IT industry, such as ones from the health care market. But high-tech companies typically have been the largest users of the visas.

Compete America, a Washington-based lobbying group that is spearheading a coalition of IT vendors and universities, is urging Congress to act on a cap increase bill before year's end. Robert Hoffman, an Oracle executive who is co-chair of Compete America, said the group has "always remained open to providing reforms of the program." But he argued that some of reforms Grassley is seeking would make the H-1B program too costly for U.S. firms to administer.

"We have said this from Day One: All things being equal, of course, we are going to hire the American worker," Hoffman said. "It's cheaper to hire an American worker." He added that companies need to employ support teams of administrators and attorneys in order to bring H-1B workers aboard.


As I.T. professionals, we need to support Senators Grassley and Durbin in their efforts to fix the broken H-1B program. One suggestion I wish to make is to broaden the recipient pool of the increased application fees to include retraining programs for I.T. professionals who need to update their skills but find that their employers would rather hire H-1B candidates who already have the new skills than retrain their existing employees.

Sphere: Related Content