Monday, October 29, 2007

Malicious Software Is the Real Pandemic

Microsoft released its Malicious Software Removal Tool (MSRT) a couple of years ago, and with it came a way to track the number and types of malicious software that hackers are releasing upon us.

And that tracking has uncovered some alarming statistics. For the first half of 2007, Microsoft detected 31.6 million phishing scams, an increase of more than 150% over the second half of 2006. It also found a 500% increase in Trojan downloaders and droppers, software that installs Trojans, password stealers, keyboard loggers and other malware on victims' systems. Also on the rise is a growing number of backdoors, a category that includes bots and one the company called an increasing threat to instant messaging users.

In his keynote speech at RSA Conference Europe in London on Oct. 23, Ben Fathi, corporate vice president of development for the Windows Core Operating System Division at Microsoft, presented these results from Microsoft's latest twice-yearly survey.

Brendon Lynch, Microsoft's director of privacy strategy, told eWEEK in an interview, Attacks Aimed at Personal Data Soar, that the findings simply confirm what security professionals have been seeing for some time:

"Critical information is a very valuable resource, for the businesses that use it to the consumers [who] get personalized services, and also [it's] valuable to criminals—it's the currency of crime online," he said. "That's what they're targeting, to hijack accounts with fraud and the like."

Microsoft found that the large increase in personal data thievery (backdoors, bots, password stealers and keyloggers) during the first half of 2007 was due almost exclusively to one family of malware, Win32/IRCbot, which accounted for 81 percent of all backdoor detections in Windows Live Messenger. The Trojan interception rate also surged, with MSRT picking up 5.9 million downloaders and droppers, up from 960,000 in the previous six months.

How much of this is due to increasing malware and how much is due to Microsoft refining the detection capabilities of MSRT is up for debate.

"The quality of detection is going to of course affect these numbers," Lynch said. "As we continue to improve the quality [of the tool's capabilities] there's [going to be more threats detected]. We don't believe that's the whole thing, though."

Rich Mogull, an independent security consultant, founder of Securosis LLC and former Gartner analyst, said that Microsoft is, in fact, getting better at detection, but there "really is more phishing."

"It also doesn't matter," he said in an e-mail exchange. "It's so bad now that those that can't filter and manage it to a reasonable level can't effectively use e-mail anymore anyway."

As far as the 500 percent increase in Trojans goes, Mogull has faith in Microsoft's figures. "We do know that Trojan and other malware kits are far more available and in active use," he said. "I strongly suspect that the 500 percent increase is a combination of more malware and better detection. … MSRT definitely needed to improve detection, but most of the stats I've seen are supporting big increases, if not at the levels MSRT is reporting."

MSRT removed malware from one out of every 217 computers it scanned in early 2007, compared with one out of 409 in late 2006 and one out of 359 in the second half of 2005. Microsoft said the increases are likely due to improvements in MSRT along with the addition of highly prevalent families, such as Win32/Renos, Win32/Stration and Win32/Alureon. Some families MSRT has detected—Win32/Hupigon, Win32/Bancos and Win32/Banker—target data theft and banking information.

The report also noted that the number of new software vulnerabilities remains large, with about 3,400 new vulnerabilities disclosed in the first half of the year. That number was down from 2006, however, marking the first decline in total vulnerability disclosures since 2003.

Classic e-mail worms continue to shrink in significance: They dropped to 49 percent of total malware detected in e-mail. Phishing scams increased, however, from 12.4 million in the second half of 2006 to 31.6 million in the first half of 2007. E-mail containing malicious iFrame attacks accounted for 27 percent of e-mail malware in the second half of 2006, rising to 37 percent six months later.
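
As a quick sanity check on those percentages, here's a back-of-the-envelope calculation in Python. The half-year totals are the ones reported above; the arithmetic and rounding are mine, so treat it as a minimal sketch rather than anything from Microsoft's report:

```python
# Quick sanity check on the period-over-period increases quoted above.
# Figures are the half-year totals reported in Microsoft's survey.

def pct_increase(old, new):
    """Percentage growth from one half-year period to the next."""
    return (new - old) / old * 100

phishing_h2_2006 = 12_400_000   # phishing scams, second half of 2006
phishing_h1_2007 = 31_600_000   # phishing scams, first half of 2007
trojans_h2_2006 = 960_000       # downloaders/droppers, second half of 2006
trojans_h1_2007 = 5_900_000     # downloaders/droppers, first half of 2007

print(f"Phishing growth: {pct_increase(phishing_h2_2006, phishing_h1_2007):.0f}%")  # ~155%
print(f"Trojan growth:   {pct_increase(trojans_h2_2006, trojans_h1_2007):.0f}%")    # ~515%
```

The numbers line up: roughly 155% growth in phishing and just over 500% growth in downloaders and droppers, matching the figures Microsoft quoted.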

Microsoft found that machines running Vista and Windows XP SP2 had "significantly" lower infection rates than older Windows operating systems.

Lynch said that this was due to stronger security controls and features (for example, Vista asks users to OK downloads) rather than to the fact that Vista isn't widely adopted yet and therefore hasn't been as attractive a target for malware authors.

"[The comparison was] done on a comparable basis, so it was like for like," he said.

As I have said repeatedly in this blog, the time has come for government to step forward and write comprehensive legislation to protect the American public from these security breaches. If it doesn't, the day will soon come when these attacks affect all of us.

Saturday, October 27, 2007

Data Centers Account For Up To 15% Of I.T. Budgets; Google Says Their Future Is Cloudy

A new article on research from the Uptime Institute reports that data center costs have risen from 1 to 3% of I.T. budgets to between 5 and 15%, and that is attracting industry attention. The article from PC World, called Data Centers Devour 5 to 15 Percent of IT Budget, says:

That's enough "to threaten the economic productivity of enterprise IT," says Ken Brill, executive director of the Uptime Institute.

Many standard total cost of ownership (TCO) calculations around data centers fall short on both capital and operating expenses, underestimating factors such as power and cooling, according to the research.

In conjunction with the new research, the institute is offering a downloadable TCO calculator for IT execs. The tool, designed to examine long-term costs and possible tradeoffs, and the supporting research were developed by Jonathan Koomey, PhD, a senior fellow of the Uptime Institute, a project scientist at Lawrence Berkeley National Laboratory, and a consulting professor at Stanford University.
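
To make the point concrete, here's a rough sketch of the kind of math such a calculator does. Every number below (server count, prices, PUE, the three-year horizon) is a hypothetical assumption of mine, not a figure from the Uptime Institute tool:

```python
# Back-of-the-envelope data center TCO sketch (hypothetical numbers, not the
# Uptime Institute model). The point: power and cooling over the life of the
# hardware can rival the purchase price of the servers themselves.

servers = 500
cost_per_server = 4_000          # capital cost, USD
watts_per_server = 400           # average draw per server
pue = 2.0                        # power usage effectiveness: cooling/overhead doubles the draw
electricity_rate = 0.10          # USD per kWh
years = 3

capital = servers * cost_per_server
kwh_per_year = servers * watts_per_server * pue * 24 * 365 / 1000
energy = kwh_per_year * electricity_rate * years

print(f"Capital (servers):      ${capital:,.0f}")
print(f"Power + cooling ({years} yr): ${energy:,.0f}")
print(f"Energy share of TCO:    {energy / (capital + energy):.0%}")
```

Even with these made-up inputs, electricity and cooling come to roughly a third of the total, which is exactly the kind of cost that the standard TCO calculations the report criticizes tend to leave out.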

What can be done? Google thinks it knows; after all, it has to be doing something really big with all those new engineers it is hiring. The industry term for its solution is "cloud computing," and according to Nicholas Carr in his Rough Type post Google's cloud:

Google not only wants to hold all our data; it also wants to host all our applications. That, according to Robert Cringely, is the secret meaning behind the recent announcement that Google would be contributing code to MySQL, the popular open-source database. "While Google has long been able to mess with the MySQL code in ITS machines," writes Cringely, "it hasn't been able to mess with the code in YOUR machine and now it wants to do exactly that." Why? Because "Google wants us to embrace not just cloud computing but Google's version of cloud computing, the hooks for which will be in every modern operating system by mid-2009, spread not by Google but by a trusted open source vendor, MySQL."

Mid-2009 is when the new version of MySQL, with the Google hooks, will appear, says Cringely. It's also, he argues, when Google's global data center network will be complete.

THAT's when things will get really interesting. Imagine a much more user-friendly version of Amazon's EC2 and S3 services, only spread across 10 or more times as many machines. And as with all its services, Google will offer free versions at the bottom for consumers and paid, but still cost-effective versions nearer the top for businesses and education.

Google's goal here is to help us, of course, but along the way the company will have marginalized most higher-end computing vendors, especially Microsoft. They will have also made us totally dependent on Google services in such a way that we'll never, ever, be able to extricate ourselves. We'll be slaves, but happy slaves, and Google will come to dominate all computing for the next generation. Take the $100+ billion that U.S. industry currently spends each year on data center-based computing, cut that price in half and send it straight to the Googleplex.

Far-fetched? A bit. But sometimes you have to stretch to see the future. As Google made clear earlier this week, it's plotting an aggressive move into the delivery of applications, for businesses as well as consumers. Said CEO Eric Schmidt: “Most people who run small businesses would like to throw out their infrastructure and use ours for $50 per year." But it's not just small businesses that Google's targeting. The company also announced it's running Google Apps pilots in a couple of dozen Fortune 1000 companies, and in that context the primary role of Google Apps may be as a Trojan Horse that gets big companies used to the idea of running their apps on the world's most efficient supercomputer. Truth be told, most big companies wouldn't hesitate to throw out their IT infrastructure if a cheaper, more flexible and more secure alternative was available. Comments Schmidt, drily: "It looks like it can get pretty big."

So when can we switch off our data centers and move all their functionality to Google? It won't happen overnight, but Google is definitely headed in that direction. Google has gone a long way toward making interactive and modular web-based applications, and its applications can be interlinked and enhanced. Google Widgets in conjunction with Google Gears may well be the future of internal applications, with all (or most) data and processing handled by Google's cloud. And as the story says, the difference is that instead of paying for your own data center, you'll be cutting a check to Google.

Is Outsourcing Insourcing If It's in America?

Fighting a weakening dollar and ever-increasing programming salaries overseas, outsourced development firms are having a harder time making ends meet, so to combat time zone, communication and budget problems, the firms are looking to the Americas for solutions. PC World has written an interesting article on the subject called Outsourcing Comes to the Americas:

Sometimes the grass is greener ... in your own backyard. That's the message delivered in the report "Time Zones Do Matter: Rediscovering the Americas and Nearshore Delivery" by Dana Stiffler, research director for AMR Research.

By 2015, India will be just one hub in a vast network from which IT organizations will deliver technology and business skills, according to the report. "Most processes are better candidates for nearshore and remote domestic delivery than they are for 100 percent onsite or 100 percent offshore," says Stiffler.

In response to what Stiffler describes as "India and China fatigue," businesses are seeking relief from the staff attrition and rising salaries that have become the norm in Asia and looking to Latin America and less expensive regions of the U.S. as an increasingly attractive option for IT services delivery.

As the rupee has strengthened against an already weak dollar, Latin American currencies have gained little vis-à-vis the greenback, making that market even more attractive. Large service providers targeting the U.S. market have already expanded into Latin America, with the most popular destinations being Brazil, and more recently, Mexico. U.S. and Indian firms are also opening up new service centers in the Cono Sur countries (the Southern Cone of South America, which consists of Argentina, Chile and Uruguay).

Service providers, and their customers, are also "rediscovering" the good old U.S. of A. Traditional IT services companies like Accenture, BearingPoint and CapGemini are setting up shop in smaller, less expensive areas like Kansas City; Hattiesburg, Miss.; and Tulsa, Okla. Meanwhile, Indian outsourcers are aggressively building and buying throughout North America.

"Delivery of business and IT services from Latin America and the United States will not supplant what's now being done in India or China," Stiffler says. "It simply enlarges the potential pool of processes and technologies that can be addressed."

All this comes as the outsourcing market in the States is in decline, down 16% this year alone, as discussed in U.S. Demand for Outsourcing Slows Down:
The total contract value of outsourcing contracts signed in the third quarter of 2007 was down 16 percent, with the actual value of the contracts signed shrinking as well, according to outsourcing adviser TPI. At the heart of the decline: the slowing pace of contract awards in the U.S. TPI's numbers show that U.S. companies are also keeping a lid on outsourcing growth, with new scope down 50 percent from last year.

Conversely, Europe and Asia are showing growth in outsourcing deals year over year, with Europe accounting for more than a 50 percent share of global market deals. New scope is up 36 percent in Europe and 72 percent in Asia Pacific, according to TPI. Competitor EquaTerra also found that outsourcing growth was strongest in the Europe/Middle East/Africa geography.

Almost as many Global 500 companies are inking outsourcing deals in the U.S. and Europe (43 percent of leading U.S. companies and 52 percent of leading European companies), says TPI. It's just that the American deals are smaller.

Mega-deals (those once popular billion-dollar-plus behemoths) are still getting signed (by General Motors, Johnson & Johnson, Credit Suisse, Reuters and the U.K. Post Office, among others). They're just getting less "mega." The average size of the billion-plus contract in the first quarter of last year was US$9.6 billion. In the third quarter of 2007, it was down to $2.4 billion, TPI reports.

Major India-based vendors have seen their U.S. customer revenue increase 37 percent, despite the slowdown in overall outsourcing in the Americas, says TPI, adding that "the latter exemplifies the diversity in the global outsourcing industry as well as India's expanding influence and strength." Meanwhile, EquaTerra's third-quarter survey revealed increasing interest in offshoring outside of India. Wage inflation, U.S. dollar weakness and changing buyer demands are driving the expansion of delivery centers in China, Central and South America, and Central and Eastern Europe, EquaTerra notes.

Business process outsourcing (BPO) analyst firm Nelson Hall says the BPO market is hot, with year-over-year total contract value growth of 26 percent in the third quarter and 54 percent growth during the first nine months of the year. TPI characterized global BPO growth as "sluggish." But both TPI and Nelson Hall's reports agreed that BPO contract activity was strongest in North America.

I still believe that American innovation will more than make up for any cost advantage offshore development provides over the long run, particularly when you consider that wage differences will moderate in offshore countries as wage competition takes hold.

Wednesday, October 17, 2007

Who's Who in Information Technology (I.T.) Security

You've read the stories and you're ready to improve your company's computer security, but who do you talk to? Well, for this year, InfoWorld has published a list of the top 10 up-and-coming computer security firms in 10 IT security companies to watch. I'm only going to list the basics here; read their article for more complete information.

2Factor
Founded: 2006
Headquarters: Maumee, Ohio
Funding: $1.6 million in first-round financing
CEO: David Burns

NetWitness
Founded: 2006
Headquarters: Herndon, Va.
Funding: $7.5 million from undisclosed angel investors
CEO: Amit Yoran

PaloAlto Networks
Founded: 2005
Headquarters: Alviso, Calif.
Funding: $28 million from Globespan Capital Partners, Greylock Partners and Sequoia Capital
CEO: Dave Stevens

Provilla
Founded: February 2005
Headquarters: Mountain View, Calif.
Funding: $10 million in private funding; investors include Hitachi Systems
CEO: Antonio Espinosa

Robot Genius
Founded: 2005
Headquarters: Oakland, Calif.
Funding: $2 million from Kingdon Capital and Venio Capital Partners
CEO: Stephen Hsu

SailPoint
Founded: 2005
Headquarters: Austin
Funding: $14 million from venture capital firms including Austin Ventures, Lightspeed Venture Partners, Origin Partners and Silverton Partners
CEO: Mark McClain

Sentrigo
Founded: 2006
Headquarters: Kfar Saba, Israel; U.S. office in Woburn, Mass.
Funding: $3.5 million from Benchmark Capital
CEO: Nathan Shuchami

Venafi
Founded: 2004
Headquarters: Salt Lake City
Funding: $20 million in venture capital from Foundation Capital, Origin Partners, and UV Partners.
CEO: Trell Rohovit

Veracode
Founded: 2007
Headquarters: Burlington, Mass.
Funding: $19.5 million from venture capital firms 406 Ventures, Atlas Venture and Polaris Venture Partners
CEO: Former Symantec executive Matt Moynahan

WebLOQ
Founded: January 2004 (in stealth mode until the service launched in September)
Headquarters: Monterey, Calif.
Funding: More than $3 million from high-net-worth individuals, no venture capital
CEO: Neal Smith

Now take the next step and contact the firm of your choice to enhance your company's security.

Business Software Alliance Joins Call For Federal I.T. Security

What do you know, I'm not the only one calling for new federal legislation on securing our I.T. systems and data. A group called the Business Software Alliance (BSA) is calling for the federal government to address cybercrime and increase enforcement. InfoWorld reports in Tech group calls for new cybersecurity legislation that:

Members of the BSA, a trade group based in Washington, D.C., on Monday asked Congress to pass the Cyber-Security Enhancement Act, which would expand the computer crimes statute in federal law to include the stealing of access codes or electronic identifiers from a computer. The bill would also make it a crime to access a computer without authorization, even if the access does not cause damage, and it would define a new crime of conspiracy to commit cybercrime.

U.S. computers have "never been so vulnerable to attack," said Art Coviello, president of the RSA security division of EMC. In many cases, the vulnerabilities come from companies and individuals needing to share more and more information with others, "without understanding the risks," he said.

Companies and individual computer users need to rethink the way they address security, said Coviello, speaking at a BSA forum. Computer users need to reject popular beliefs that security can be bolted on to software after it's developed and that security can be accomplished with a perimeter defense, he said.

Cybersecurity needs to become more granular, and organizations must begin to prioritize what information they need to keep most safe, Coviello said. "Security needs to adapt to facts and circumstances," he said.

Coviello criticized Congress, saying it only focuses on cybersecurity for a short time each year, when U.S. agency cybersecurity grades come out. In the following weeks, some members of Congress will get up in arms about all the bad grades, then forget about the issue, he said.

But what U.S. agencies need is funding for cybersecurity efforts, he said.

"Give money, not lectures," Coviello said. "Then you can hold people accountable."

The Cyber-Security Enhancement Act, introduced in May, would also give an additional $10 million a year to three U.S. government agencies that fight cybercrime.

Representative Steve Chabot, an Ohio Republican and co-sponsor of the bill, told the BSA that more laws are needed to fight organized cybercrime.

"The rise in the number of sophisticated cybercrimes ... shows that we need to do more to protect individuals and businesses," Chabot said. "These cyberattacks are becoming increasingly sophisticated."

We all need to do our part in letting our Congressmen know that we need to do this, before the cyberattacks and data thefts go beyond serious and enter the realm of life-threatening.

Speaking of cyberattacks, and following up on my post on the investigation of the Department of Defense cyberattack, CNET is providing additional support for the need for tougher security standards in Will cyberintrusions crash U.S. electrical grid?

Some critics of the U.S. government's cybersecurity efforts might argue that nothing short of a bomb going off--or, well, purported Chinese cyberattacks on feds' machines--will land the issue more notice.

Without tougher security standards, Americans are in danger of hacker-induced blackouts, some politicians say.

This time around, the wake-up call for politicians was, indeed, an explosion: In September, U.S. Homeland Security officials revealed that researchers at the Idaho National Laboratory had managed to destroy a small electrical generator through a simulated cyberattack. A few weeks ago, CNN aired a gloom-and-doom segment featuring snips from the once-classified video showing the device going up in smoke.

Although the prospect of that sort of incident causing massive disruption to the U.S. electrical grid has been around for years, the success of the experimental hack is drawing new calls from Congress for tougher federal security standards on the computer systems that control the nation's power systems.

"I'll be blunt--if this administration doesn't recognize and prioritize these problems soon, the future isn't going to be pretty," said Rep. Jim Langevin (D-R.I.), chairman of a House of Representatives cybersecurity panel that convened a hearing here on the topic Wednesday afternoon.

It's widely agreed that the threats to so-called "control" systems--sometimes known by the acronym SCADA, short for "Supervisory Control And Data Acquisition"--have grown in recent years. That's because more and more of them are being hooked up to "open" networks, including corporate intranets and the Internet, in an effort by their owners and operators to improve efficiency and lower costs.

But there was never much focus on the idea of building security features into those systems when they were first created, and that trend, unfortunately, continues today, said Joseph Weiss, a consultant and nuclear engineer who spent more than 30 years designing, implementing and analyzing control systems.

Feds: We're on it
Government regulators, for their part, say they are growing increasingly aware of those shortcomings and working valiantly to address the problem. Homeland Security's cybersecurity czar, Greg Garcia, told politicians Wednesday that his agency is handing out cybersecurity self-assessment guidelines to control systems operators, offering training to workers in that sphere, and distributing recommended "mitigations" against real-world attacks like the one simulated in Idaho.

And right now, the Federal Energy Regulatory Commission (FERC), which is responsible for overseeing the reliability of the nation's power systems, is considering proposed rules that purport to strengthen cybersecurity standards for the nation's power systems.

That proposal, however, falls woefully short of offering sufficient protections, Langevin and his Democratic and Republican colleagues said in comments filed recently with FERC. One major problem: The proposed rules are written in such a way that they would not even require electric grid operators and owners to install comprehensive security measures on all critical pieces of their systems that, if compromised, could cause significant disruptions, they argued. Instead, they'd have some latitude to focus only on certain components and neglect others.

The politicians are urging FERC to incorporate some of the more comprehensive, stringent standards developed by the National Institute of Standards and Technology, which is considered home to the government's technical experts.

Weiss, the consultant, argued that the infamous blackout that pummeled the Northeast in August 2003 (and was reportedly linked to the so-called MSBlast worm) arguably wouldn't have been prevented by the proposed regulations, but the NIST rules are comprehensive enough to deal with that issue.

Some suggested that the rules may not be up to par because, as required by law, they were devised chiefly by a group called the North American Electric Reliability Corporation (NERC), which was long considered the trade association for the power industry and was recently given legal authority to propose regulations for federal regulators to approve. An entity with those potential conflicts of interest isn't necessarily well-positioned to come up with objective standards, and it's high time for Congress to create a more independent means of devising critically important cybersecurity rules, Weiss said.

Rep. Zoe Lofgren (D-Calif.) appeared sympathetic to that idea and suggested that Homeland Security's cybersecurity division should be granted more authority to help out. "I don't think the energy sector is necessarily the expert on cybersecurity," she said.

NERC Executive Vice President David Whiteley said his organization was open to revising the proposed rules, while Joseph McClelland, director of FERC's Office of Electric Reliability, acknowledged that further improvements should be made before the rules gain final approval.

Although the electric grid was the primary focus Wednesday, threats to the control systems that deal with myriad other types of utilities could also prove, how shall we say, messy.

I'm glad someone is starting to pay attention to this vital topic. Now get out there and spread the word, or we'll be worrying about a lot more than who has our SSNs.

Gartner's Hils Offers Budget I.T. Security Tips

ZDNet has a post, directly from the Gartner Symposium last week, called Tipsheet: Information security on the cheap. In it they talk about simple, inexpensive measures that can be taken to improve security, and what they have to say is well worth the time spent reading it. Not everything we do needs to be expensive to be effective; take a look:


These tips were culled from a talk Adam Hils gave at the Gartner Symposium/ITxpo last week. The tips were targeted toward midmarket companies.

Security steps that’ll cost almost nothing:

* Evaluate the patch status of all production systems connected to general-purpose networks.
* Deploy standard configurations on your PCs and servers to reduce vulnerabilities and improve patch deployment success rates.
* Look for ways to standardize your PC environment by classifying users by their need to manage their own PCs. P.S. Locking down everyone doesn’t work.
* Negotiate more features with your desktop security vendor when you renew.
* Make sure you have anti-spyware and personal firewalls–and don’t pay for them.
* Block all attachments from outside except for those used in business (.zip, .doc, .xls, .pdf and .ppt); see the sketch after this list.
* Limit administrator privileges to administrators.
* Don’t allow critical files (customer and employee records and intellectual property) to be printed or downloaded. Any exceptions should be documented and justified.
* Erase all data on the hard drive before recycling or throwing away a PC.
* Disable all inactive accounts.
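
As a concrete illustration of the attachment rule above, here's a minimal sketch of an extension allowlist check. The allowed extensions are the ones from the tip; a real mail gateway would also inspect file contents, not just names:

```python
import os

# Extensions the business actually uses, per the tip above.
ALLOWED_EXTENSIONS = {".zip", ".doc", ".xls", ".pdf", ".ppt"}

def attachment_allowed(filename: str) -> bool:
    """Return True if the attachment's extension is on the allowlist."""
    _, ext = os.path.splitext(filename.lower())
    return ext in ALLOWED_EXTENSIONS

# Example: the .pdf passes, the double-extension executable is rejected.
for name in ("q3_report.pdf", "invoice.pdf.exe", "party_photos.scr"):
    print(name, "->", "allow" if attachment_allowed(name) else "block")
```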

Inexpensive security steps:

* Minimize benign data leakage by instructing employees of best practices.
* Change passwords on root and administrator accounts; review help desk and password resets.
* Restrict access to USB/removable media points where possible. Apply policy restrictions on others.
* Examine security practices for remote access.
* Block every port that your business does not require to be open (see the audit sketch after this list).
* Use compliance as a rationale for more security funding.
* Narrow the vendor list. The less vendors you have the more leverage you get.
* Consider security delivered through SaaS, all in one appliances, open source and thin client computing.
* Don’t spend money on things you don’t need. Examples include: Personal digital certificates, 500-page security policies, security awareness posters, biometrics and passive intrusion detection.
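
And for the port-blocking tip, here's a minimal sketch of a simple audit that flags listening TCP ports outside an allowlist. The host name and allowed ports are hypothetical, and you should only run something like this against systems you administer:

```python
import socket

HOST = "example.internal"            # hypothetical host you administer
ALLOWED_PORTS = {22, 80, 443}        # ports the business actually requires
PORTS_TO_CHECK = range(1, 1025)      # well-known ports

def is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

unexpected = [p for p in PORTS_TO_CHECK if p not in ALLOWED_PORTS and is_open(HOST, p)]
print("Open ports not on the allowlist:", unexpected or "none")
```
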
They say "An ounce of prevention is worth a pound of cure," but in today's digital world I would say "An ounce of prevention is worth a ton of cure." When it comes to computer security, a little time spent in forethought and planning can save days and weeks of recovery.

Monday, October 15, 2007

Gartner Official Agrees, Breaches Bigger, More Expensive

I really do wish I weren't right about this, but now Gartner agrees with me: data breaches will be bigger and more expensive for the next few years. Check out the post in PC World, Data Breaches Are Getting More Expensive, Analyst Says:

Financially motivated data breaches are set to cost businesses 20 percent more each year until 2009, according to Gartner.

John Pescatore, VP at Gartner, said the biggest risk to organizations came from targeted attacks. He said that "phishing and identity theft attacks have caused the rise of 'credentialed' attacks, in which the attacker uses the credentials of a legitimate user."

Malicious software attacks allowed internal executables to be used to forward information to an external attacker, Pescatore warned. "Being aware of 'inside out' communications and being able to block those as effectively as 'outside in' is becoming increasingly important," he said.

It was important to make sure that security strategies reduced the cost of dealing with mass attacks, Gartner advised, in order to free up budgets for the next generation of security attacks.

The analyst group reckons the average business is spending more than 5 percent of its IT budget on security, and another 7 percent on disaster recovery.

But it said 90 percent of targeted attacks could be avoided without an increase in firms' security budgets, and said the investments that enterprises had made in intrusion prevention, vulnerability management and network access control had largely paid off.

At the same time, however, it warned that there is currently little or no correlation between the organizations that spend the most on security and those that are best protected.

It said the most effective way to increase the efficiency of security spending was to avoid vulnerabilities by ensuring that security was a top requirement for every new application, process and product. It was also important to establish security metrics to measure spending efficiency, it added.
Unfortunately, in order to address this issue, government officials will need to become much more I.T. savvy, or at least get aid from computer-savvy consultants. As I have advocated in the past, a comprehensive data protection plan needs to be put in place, with real teeth, in order to make businesses spend the time and treasure required to put the needed safeguards in place.

Thursday, October 11, 2007

District Judge Charles Breyer Blocks Employer SSN Check

In answer to all who have wondered why the United States hasn't used existing laws to ensure that companies hire only workers who are in the country legally: it's the judges that permeate the lower levels of the judiciary.

The Washington Post is reporting in Effort to Curb Illegal Workers' Hiring Blocked that administration efforts to enforce the existing law requiring employers to check the validity of employees' Social Security numbers have been blocked by U.S. District Judge Charles R. Breyer of the infamous Northern District of California. Judge Breyer is the brother of Justice Stephen G. Breyer of the Supreme Court and was nominated by President Bill Clinton in 1997.

What is it going to take to prove to these judges who is legal and who isn't? According to the opinion accompanying the decision, the judge's reasoning was that the Social Security database contains incorrect information. Just how incorrect is it? I'll bet that 98+% of all the information contained in it is correct; otherwise the SSN database managers are truly negligent and need to be dealt with immediately.

What troubles me about this decision is that the SSN is not the only piece of information in a Social Security record that could be checked to see whether the SSN is valid. If the card presenter is a 23-year-old male in California, and the SSN belongs to a 72-year-old female from Maine and has been actively used since 1949, then there is a very good probability that the 23-year-old isn't using the correct SSN. If one or two parts of the SSN record don't match, it could be bad data; if 10 or 12 parts of the record don't match, the card presenter is illegal.
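
To illustrate the kind of multi-field check I'm describing, here's a minimal sketch of a match score across several attributes of a record. The fields and the threshold are my own illustrative assumptions, not anything the Social Security Administration actually uses:

```python
# Minimal sketch of multi-field matching against a stored record.
# Fields and thresholds are illustrative assumptions, not SSA practice.

RECORD_FIELDS = ("name", "birth_year", "sex", "state_of_issue", "status")

def mismatch_count(presented: dict, on_file: dict) -> int:
    """Count how many checked fields disagree between the two records."""
    return sum(1 for f in RECORD_FIELDS if presented.get(f) != on_file.get(f))

def assess(presented: dict, on_file: dict) -> str:
    misses = mismatch_count(presented, on_file)
    if misses <= 1:
        return "probable data-entry error -- flag for correction"
    return "probable invalid use of this SSN -- flag for investigation"

on_file = {"name": "Jane Doe", "birth_year": 1935, "sex": "F",
           "state_of_issue": "ME", "status": "active since 1949"}
presented = {"name": "John Smith", "birth_year": 1984, "sex": "M",
             "state_of_issue": "CA", "status": "new hire"}

print(assess(presented, on_file))   # nearly every field disagrees
```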

As someone whose profession includes detecting and repairing incorrect information in databases, do I believe there is bad data in the SSN database? I would answer, yes, of course there is. Is this unusual for large databases? I would answer, no, of course not.

Computers are not perfect, and as long as humans are responsible for inputting the data, some of the data will be inaccurate. The trick is to make it as accurate as possible, and efforts need to be made to do so. But if perfection is the standard, it will never be achieved, and that goes for banks, stock brokerages, the Social Security Administration and the IRS.

Just because some of the data is incorrect, you can't disregard a database in its entirety, otherwise, I can assure you our economy would come to a screeching halt.

This decision illustrates the need for fair and reasonable judges to be appointed to the bench. Hopefully it will rapidly be overturned by a higher court.

Wednesday, October 10, 2007

Google And IBM Pitch In To Improve Computer Science Education

Unlike Microsoft, which would rather complain about the state of computer science education in this country, Google and IBM are making an effort to give the country's computer science students experience in cutting-edge, large-scale distributed computing. O'Reilly Radar reports in Google & IBM giving students a distributed systems lab using Hadoop that:

Google & IBM have partnered to give university students hands-on experience developing software for large-scale distributed systems. This initiative focuses on parallel processing for large data sets using Hadoop, an open source implementation of Google's MapReduce. (See Tim's earlier post about Yahoo & Hadoop)
“The goal of this initiative is to improve computer science students’ knowledge of highly parallel computing practices to better address the emerging paradigm of large-scale distributed computing. IBM and Google are teaming up to provide hardware, software and services to augment university curricula and expand research horizons. With their combined resources, the companies hope to lower the financial and logistical barriers for the academic community to explore this emerging model of computing.”

The project currently includes the University of Washington, Carnegie-Mellon University, MIT, Stanford, UC Berkeley and the University of Maryland. Students in participating classes will have access to a dedicated cluster of "several hundred computers" running Linux under XEN virtualization. The project is expected to expand to thousands of processors and eventually be open to researchers and students at other institutions.

As part of this effort, Google and the University of Washington have released a Creative Commons licensed curriculum to help teach distributed systems concepts and techniques. IBM is also providing Hadoop plug-ins for Eclipse.
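
For readers who haven't run into MapReduce before, here's a tiny Python sketch of the programming model Hadoop implements: a map step that emits key/value pairs and a reduce step that combines the values for each key. This is just the concept in miniature, not Hadoop's actual Java API, and in a real cluster the map and reduce steps run in parallel across many machines:

```python
from collections import defaultdict

# Toy word count in the MapReduce style: map emits (word, 1) pairs,
# the framework groups pairs by key, and reduce sums each group.

def map_phase(document: str):
    for word in document.lower().split():
        yield word, 1

def reduce_phase(word, counts):
    return word, sum(counts)

documents = ["the cloud is the future", "the future is distributed"]

grouped = defaultdict(list)
for doc in documents:
    for word, count in map_phase(doc):
        grouped[word].append(count)

results = dict(reduce_phase(w, c) for w, c in grouped.items())
print(results)   # {'the': 3, 'cloud': 1, 'is': 2, 'future': 2, 'distributed': 1}
```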

I applaud their efforts, and what they are doing will no doubt help. But something needs to be done to restore the prestige of the computer science industry in the eyes of America's students. Government, as well as the big I.T. employers, needs to work to promote computer science to the best and brightest, and then work to get them secure, high-paying positions in line with the level of effort it takes to get there and the effort they must expend to keep current in the field.

Saturday, October 6, 2007

Crazy 9th Circuit Court Blocks JPL Security Efforts

I can't resist one more security article. I've been giving government and military officials a hard time lately for all the security problems they have been having. And just when I thought government I.T. security was nonexistent, up pops an organization trying to do the right thing and running into opposition from our own government.

The story was reported by Jason Song of the Los Angeles Times in his article Court bars new background checks for JPL workers. Jason reports:

The U.S. 9th Circuit Court of Appeals on Friday issued a temporary injunction blocking a federal government directive that would require new background checks for employees at NASA's Jet Propulsion Laboratory.

A group of 28 JPL scientists had sued to stop the investigations, which they said amounted to a blank check for the federal government to look into such areas as their sexual orientation and consumer histories.

The employees had been facing the potential loss of their jobs unless they met a Friday deadline to comply with the directive by filling out questionnaires and signing a waiver allowing the investigations.

A U.S. District Court had upheld the background checks Wednesday.

"This ruling shows we're not going to let hysterical fear and innuendo undermine the Constitution," said attorney Dan Stormer, who represented the JPL scientists.

Calls made to JPL's La CaƱada Flintridge offices were not returned Friday. But NASA Administrator Michael Griffin said in June that the background checks were part of a post-Sept. 11 security update and that the agency was intent on carrying them out.

In its order, the 9th Circuit judges wrote that they had not had time to review all of the case material but noted that the JPL employees had raised serious legal and constitutional issues and showed the "probability of irreparable harm."
JPL has tried to do the right thing in requiring background checks for all of its workers. After all, this is the nation's Jet Propulsion Laboratory, not the neighborhood McDonald's; JPL workers deal day to day with some of the nation's most sensitive secrets. How could even the nation's most liberal federal circuit court block JPL's efforts to check the backgrounds of its employees?

Los Alamos could have learned a thing or two from JPL on this point; it endangered the nation's nuclear secrets when it didn't check the backgrounds of some of its employees closely enough.

Forrester Research Studies Web 2.0 Security Threats

What's with all these security problems lately? Have the Chinese thrown in with al-Qaeda in an attempt to bring the Internet down once and for all? With all the security problems reported this week, it is starting to look that way.

A Forrester Research study has found that nearly all enterprises adopt Web 2.0 technologies without preparing to fend off the related threats. It found that while 96% of the 153 I.T. professionals interviewed have eagerly adopted Web 2.0 technologies, 90% of them are "very concerned" about the security threats Web 2.0 has brought with it.

"Today, the Internet is beleaguered with threats such as phishing, viruses, spyware, and botnets, all threatening to challenge your business operations," said Chenxi Wang, a principal analyst with Forrester, in a written statement. "Forrester's study ... reveals that most companies are slow to respond to the latest threats, or aren't sure what to do to adequately secure. We have found that most companies that have implemented any kind of Web protection have only installed URL filtering and signature scanning. Yet, malware writers are now using the Web as a primary vehicle to propagate a plethora of new threats undeterred by traditional security means. The need for more effective Web protection has never been greater."

The customer-generated content that Web 2.0 offers gives malicious hackers just enough access to enterprise networks to infect them with malicious code and, in turn, infect the users of the social networking features. In one recent occurrence, a single worm planted in a MySpace page infected more than 1 million users.
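
The usual first line of defense against that kind of attack is to sanitize anything a user submits before rendering it to other users. Here's a minimal sketch of the idea, which simply escapes all markup; sites that need to allow some formatting should use a maintained HTML-sanitizing library with a strict tag allowlist rather than a home-grown filter:

```python
import html

# Minimal sketch: escape everything a user submits so that any markup they
# include (script tags, iframes, event handlers) renders as inert text.

def render_comment(user_input: str) -> str:
    return "<p>" + html.escape(user_input) + "</p>"

hostile = '<script src="http://evil.example/worm.js"></script>Nice profile!'
print(render_comment(hostile))
# <p>&lt;script src=&quot;http://evil.example/worm.js&quot;&gt;&lt;/script&gt;Nice profile!</p>
```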

I'm not going to go into another rant about what needs to happen next, but something needs to be done, and soon.

How Secure Are Our Government Websites?

More on the Security Problem that knocked *.ca.gov off the Internet Wednesday afternoon.

Evidently the security problem was larger than first reported. PC World is reporting in Danger Lurks on Government Web Sites that, in addition to the websites of Madera County, California, Brookhaven National Laboratory websites were also compromised. And as of Thursday morning, both the Brookhaven and the Superior Court of Madera County websites were still pointing to inappropriate content. The article continues:

Brookhaven has begun an investigation into the incident, said Tom Schlagel, a manager with the lab's information technology division. "From what I've been told, there isn't any evidence that there's any pornography on the server," he said. "It's all just redirections."

Brookhaven is a U.S. Department of Energy lab that specializes in nuclear and high-energy research.

The security of U.S. government Web sites has been front-page news in California this week after the U.S. General Services Administration, which administers the .gov top-level domain, temporarily removed California's state servers from the Internet's Domain Name System (DNS) infrastructure, apparently because of a security problem on the Web site of a small state agency, the Transportation Authority of Marin.

Observers said that this was an unusually extreme move and one that eventually would have knocked the entire state off the Internet.

The move was caught before it caused widespread service outages within the state government, but it drew attention to an underlying issue: compromised governmental Web sites.

The GSA on Thursday said it has revised its policies to avoid another possible statewide shutdown, but it defended its right to remove .gov sites from the Internet. "The potential exposure of pornographic material to the citizens -- and tens of thousands of children -- in California was a primary motivator for GSA to request immediate corrective action," a GSA spokeswoman said in an e-mailed statement. "Also, in these days of heightened security concerns from hackers, it is important to quickly stop potentially harmful damage to federal, state and local Web sites from those who have no love for our country."

Alex Eckelberry, the president of Sunbelt Software Inc. who first reported the problem that led to the California shutdown, said that the government could do better. "Once you're on the Web, especially if you're government, you really have to be responsible for your content," he said. "We have to have some sort of recognition that there is constant danger and that people need to stay on top of their sites."

However, educational sites, hosted within the .edu top-level domain, have far more problems than the .gov sites, Eckelberry said.

Security professionals like Eckelberry complain that poor response from Web site administrators means that even when problems are discovered by outside researchers, it's often hard to report them.

For example, the Madera Superior Court site has a "Click Here To E-mail Our Webmaster" link on the bottom of its front page, but when Trend Micro Inc. Network Architect Paul Ferguson used it to inform the court that its site had been hacked, his e-mail bounced back as undeliverable. Madera Superior Court representatives did not return a call seeking comment.

"It is almost impossible for someone like, say, a security researcher to find the right person to report problems to," Ferguson said via e-mail. People outside of government cannot do "Whois" queries on the .gov domain, which would yield contact information for site administrators, he added. "In many cases, either the contact information is incorrect, nonexistent, or the 'lights are on and nobody is home'."

"Everyone has really got to do a better job on securing the Internet," Ferguson added. "You can't just put a Web server out there and forget about it any more."
Security problems, both data thefts and breaches, are going to continue to get worse as technology continues to put more and more data at our fingertips.
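
In the meantime, staying on top of your own sites, as Eckelberry suggests, doesn't have to be complicated. Here's a minimal sketch of a check that flags pages redirecting off their own domain, the symptom the compromised sites above showed. The URLs are placeholders, and a real monitor would also diff page content and run on a schedule:

```python
from urllib.parse import urlparse
from urllib.request import urlopen

# Minimal sketch: fetch each page, follow redirects, and flag any page that
# ends up on a different host than the one it started on. URLs are placeholders.

SITES = [
    "http://www.example.gov/",
    "http://courts.example.gov/",
]

def redirected_off_domain(url: str) -> bool:
    original_host = urlparse(url).hostname
    with urlopen(url, timeout=10) as response:
        final_host = urlparse(response.geturl()).hostname
    return final_host != original_host

for site in SITES:
    try:
        if redirected_off_domain(site):
            print(f"WARNING: {site} redirects off-domain")
    except OSError as err:
        print(f"Could not check {site}: {err}")
```
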

It is time for governments at the local, state and national levels to learn about the current risks, what the future risks are, and what our nation's security experts suggest to solve the problems.

If government officials can't understand the complexities, then they need to seek help, both in understanding them and in drafting legitimate security laws that will be effective when implemented. Any new laws governing the industry will need to have teeth if there is any hope of bringing these problems under control.

Friday, October 5, 2007

The Feds Didn't Listen, Now *.ca.gov Disappeared

One week after I warned that U.S. government officials need to take a serious look at improving their security procedures, in FBI Investigating Unisys For Not Preventing Hacker Intrusion and Gap Analysis - Blame It On The Contractor, the news is out that, after a hacker diverted traffic from a Marin County, California government website to a porn site, a federal government employee accidentally deleted the ca.gov domain while trying to fix the problem, and it took 7 hours to recover.

And when I say the ca.gov domain, I mean the entire *.ca.gov domain. That means all California government websites, from your local DMV to Governor Schwarzenegger's office, were down for 7 full hours.

I really have to wonder what those security procedures are after this latest gaffe. Isn't the first rule of administering anything in I.T. that before making a change, you make a backup copy in case you screw up whatever you're changing? And the second rule, that you know how to restore from your backup?
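
In practice the rule is easy to follow. Here's a minimal sketch of a change wrapper that copies a configuration file aside before anyone touches it; the zone-file path is a made-up example, not how the GSA actually manages .gov:

```python
import shutil
import time

# Minimal sketch of "back up before you change it": copy the file aside with a
# timestamp so a botched edit can be rolled back in seconds, not hours.

def backup(path: str) -> str:
    stamp = time.strftime("%Y%m%d-%H%M%S")
    copy_path = f"{path}.{stamp}.bak"
    shutil.copy2(path, copy_path)
    return copy_path

def restore(backup_path: str, path: str) -> None:
    shutil.copy2(backup_path, path)

if __name__ == "__main__":
    zone_file = "/etc/named/zones/example.gov.zone"   # hypothetical path
    saved = backup(zone_file)
    # ... make the change, test it, and if it breaks things:
    # restore(saved, zone_file)
```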

The employee evidently did not know at least one of those rules, because the reason it took 7 hours to recover is that the mistake happened at the end of the workday, and it took that long to find someone who knew how to restore from the error.

Last week, in response to news that Chinese hackers had broken into Defense Department computers, I said:

The U.S. Government needs to wake up to the severity of the security problems we are currently having and be made aware of just how terrible they could get. Then government needs to get serious about mandating information security protocols for sensitive material, both public and private.

Until they do, I know I'm going to sleep a little less secure at night, how about you?
How much worse do these accidents need to get than taking down all the government websites of the country's largest state? And no, I'm still not sleeping very well.

For more on the latest security gaffe, check out Network World's Federal 'fix' knocks ca.gov for a loop, Download Squad's Federal government knocks California state websites offline and Tech Dirt's Feds Accidentally Turn Off California Gov't Websites.

Wednesday, October 3, 2007

Microsoft .Net Framework Opened For Inspection

It's almost like Microsoft is opening up the .Net Framework. Under its Reference License, developers will be able to view .Net source, just not change any of it. InfoWorld reports in Microsoft opening up .Net Framework libraries:

Opening up to developers, Microsoft is releasing its .Net Framework libraries under the Microsoft Reference License, which allows viewing of source code but not modification or redistribution, the company said on Wednesday.

The release gives developers the opportunity to better understand the inner workings of the framework's source code, Microsoft said. Microsoft's efforts fall under the company's Shared Source initiative, which allows for sharing of source code; Shared Source has been viewed as Microsoft's answer to open source, in which users can view selected source code.

Also, Microsoft will introduce a capability in the upcoming Visual Studio 2008 developer tools package to allow .Net Framework developers to debug into .Net Framework source code.

"One of the things my team has been working to enable has been the ability for .Net developers to download and browse the source code of the .Net Framework libraries and to easily enable debugging support in them," said Microsoft's Scott Guthrie, general manager in the Microsoft Developer Division, in his blog.

"Today I'm excited to announce that we'll be providing this with the .Net 3.5 and Visual Studio 2008 release later this year," he said.

Here are libraries that will be opened up for viewing:

* .Net Base Class Libraries (including System, System.IO, System.Collections, System.Configuration, System.Threading, System.Net, System.Security, System.Runtime, and System.Text).
* ASP.Net (System.Web).
* Windows Forms (System.Windows.Forms).
* ADO.NET (System.Data).
* XML (System.Xml).
* Windows Presentation Foundation (System.Windows).

So if you've ever wondered what the code those "geniuses" at Microsoft write looks like, you'll soon be able to find out. Do you think they really stick to those standards they advocate?

More information on the Microsoft Reference License can be found here.
