Steganography meets VoIP in hacker world

Researchers and hackers are developing tools to execute a new data-leak threat: sneaking proprietary information out of networks by hiding it within VoIP traffic. Techniques that fall under the category of VoIP steganography have been discussed in academic circles for a few years, but now more chatter is coming from the hacker community about creating easy-to-use tools, says Chet Hosmer, co-founder and chief scientist at WetStone Technologies, which researches cybercrime technology and trains security professionals investigating cybercrimes. "There are no mass-market programs yet, but it's on our radar, and we are concerned about it given the ubiquitous nature of VoIP," he says. Steganography in general is hiding messages so no one even suspects they are there; done digitally, it means hiding messages within apparently legitimate traffic or files. For example, secret data can be transferred within .jpg files by using the least significant bits of the image data to carry it. VoIP steganography conceals secret messages within VoIP streams without severely degrading the quality of calls.
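To make the least-significant-bit idea concrete, here is a minimal sketch in Python, assuming an uncompressed buffer of pixel bytes as the carrier; the function names and sample data are invented for illustration and are not taken from any particular steganography tool.

```python
# Minimal LSB-embedding sketch: hide a message in the lowest bit of each carrier byte.

def embed_lsb(carrier: bytearray, secret: bytes) -> bytearray:
    """Hide `secret` in the least significant bit of each carrier byte."""
    bits = []
    for byte in secret:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))   # MSB first
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for secret message")
    stego = bytearray(carrier)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit                       # overwrite only the lowest bit
    return stego

def extract_lsb(stego, length: int) -> bytes:
    """Recover `length` bytes hidden by embed_lsb."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for b in stego[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | (b & 1)
        out.append(byte)
    return bytes(out)

pixels = bytearray(range(256)) * 4                # stand-in for decoded pixel data
stego = embed_lsb(pixels, b"meet at noon")
assert extract_lsb(stego, 12) == b"meet at noon"  # message survives, pixels barely change
```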

There are more than 1,000 steganographic programs available for download online that can place secret data within image, sound and text files, Hosmer says, and then extract it. Because only the least significant bits are used, the hidden messages have little impact on the appearance of the images the files contain. There are none for VoIP steganography yet, but in the labs, researchers have come up with three basic ways to carry it out. The first calls for using unused bits within UDP or RTP protocols – both used for VoIP – for carrying the secret message. The second is hiding data inside each voice payload packet but not so much that it degrades the quality of the sound. The third method calls for inserting extra and deliberately malformed packets within the VoIP flow.
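As a rough illustration of the first approach (spare protocol bits), here is a hedged Python sketch that packs two covert bits per packet into the RTP padding and extension flag bits. Real research prototypes use subtler fields; the field choices, values and helper names here are invented purely for the example.

```python
import struct

def rtp_header(seq, ts, ssrc, covert_bits):
    """Build a 12-byte RTP header, hiding two covert bits in the P and X flags."""
    version, cc, marker, ptype = 2, 0, 0, 0        # PT 0 = PCMU voice
    padding, extension = covert_bits               # the covert channel for this sketch
    byte0 = (version << 6) | (padding << 5) | (extension << 4) | cc
    byte1 = (marker << 7) | ptype
    return struct.pack("!BBHII", byte0, byte1, seq, ts, ssrc)

secret = 0b1011_0010                               # one covert byte
bits = [(secret >> i) & 1 for i in range(7, -1, -1)]
packets = [rtp_header(seq=100 + n, ts=160 * n, ssrc=0xDEADBEEF,
                      covert_bits=(bits[2 * n], bits[2 * n + 1]))
           for n in range(4)]                      # 8 bits -> 4 packets

recovered = 0
for pkt in packets:                                # the receiving conspirator's side
    byte0 = pkt[0]
    recovered = (recovered << 2) | (((byte0 >> 5) & 1) << 1) | ((byte0 >> 4) & 1)
assert recovered == secret                         # covert byte reassembled from headers
```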

A variation calls for dropping in packets that are so badly out of sequence that the receiving phone discards them; other devices on the network that have access to the entire VoIP stream, however, can still pick them up. These techniques require compromised devices or conspirators on both ends of calls, or a man-in-the-middle to inject extra packets. "It's much more difficult to do and much more difficult to detect" than hiding data within other files, Hosmer says. The medium used to carry secret messages is called the carrier, and just about anything can be a carrier. For example, x86 executables can carry secret messages, according to Christian Collberg, an associate professor of computer science at the University of Arizona and co-author of the book Surreptitious Software. By manipulating the compiler, it can be made to choose one addition operation over another, and that choice can represent a bit in the secret message, Collberg says. "There are lots of choices a compiler makes, and whenever you have a choice, that could represent a bit of information," he says.
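A toy sketch of the compiler-choice idea Collberg describes: a pretend code generator that encodes one secret bit per emitted statement by picking between two semantically equivalent forms. The "instruction" pairs and names are invented; a real implementation would sit inside a compiler's instruction selector.

```python
# Each pair holds two equivalent ways to emit the same operation; the choice carries a bit.
EQUIVALENTS = [("t = x + x", "t = x * 2"),
               ("y = y + 1", "y += 1")]

def emit(bits):
    """Emit one line of 'generated code' per secret bit."""
    lines = []
    for i, bit in enumerate(bits):
        pair = EQUIVALENTS[i % len(EQUIVALENTS)]
        lines.append(pair[bit])          # bit 0 -> first form, bit 1 -> second form
    return lines

def recover(lines):
    """Read the bits back by checking which form was chosen at each position."""
    bits = []
    for i, line in enumerate(lines):
        pair = EQUIVALENTS[i % len(EQUIVALENTS)]
        bits.append(pair.index(line))
    return bits

secret_bits = [1, 0, 1, 1]
assert recover(emit(secret_bits)) == secret_bits
```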

Even something as broadly used as TCP/IP can be host to steganographic messages. One of the newest methods takes advantage of TCP retransmission – known as retransmission steganography (RSTEG) – in which sending machines resend packets for which they fail to receive acknowledgements. The sending and receiving machines must both be in on the steganography, according to a paper written by a group of Polish researchers headed up by Wojciech Mazurczyk at the Warsaw University of Technology. At some point during the transmission of a file, the receiving machine deliberately fails to send an acknowledgement for a packet, and the packet is resent. The resent packet is actually different from the initial packet and carries the steganographic message as its payload.
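A conceptual simulation of the RSTEG idea, not a real TCP implementation: the receiver deliberately withholds one acknowledgement, and the resulting "retransmission" carries the covert payload. All names and the in-memory packet flow are invented for illustration.

```python
# Toy RSTEG simulation. Both endpoints are in on it: the receiver withholds one ACK,
# and the retransmission that follows carries the covert payload instead of real data.

COVERT = b"exfil-me"

def sender(segments, drop_ack_for):
    """Yield (seq, payload); resend the segment whose ACK never arrived."""
    for seq, data in enumerate(segments):
        yield seq, data                         # first transmission: real data
        if seq == drop_ack_for:                 # no ACK came back (by prior agreement)
            yield seq, COVERT                   # "retransmission" carries the secret

def receiver(stream, drop_ack_for):
    seen, secret = {}, None
    for seq, payload in stream:
        if seq in seen and seq == drop_ack_for:
            secret = payload                    # second copy = covert message
        else:
            seen[seq] = payload                 # ordinary in-order delivery
    return b"".join(seen.values()), secret

file_data, hidden = receiver(sender([b"AAAA", b"BBBB", b"CCCC"], drop_ack_for=1),
                             drop_ack_for=1)
assert file_data == b"AAAABBBBCCCC" and hidden == COVERT
```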

The receiving machine can distinguish such resent packets and extract the hidden message, the researchers say. Mazurczyk and his colleagues have spent a lot of time figuring out new carriers for secret messages, publishing research on embedding them in VoIP and wireless LAN traffic. In his Crypto-Gram newsletter, security expert Bruce Schneier dismisses the threat from RSTEG: "I don't think these sorts of things have any large-scale applications," he says, "but they are clever." In general, defending against steganography is tough because traditional security devices such as firewalls and application firewalls don't detect this type of illicit transfer; a file containing a secret message looks just like a legitimate file. The best way to combat suspected use of steganography to leak corporate data is to look for the telltale signs – known steganography programs on company computers, says Hosmer. When the steganography program is known, it can be applied to the carrier to reveal the secret message.

On systems where it is found, forensic analysis may reveal files that contained messages and an indication of what data might have been leaked. That message may be in code and have to be decrypted, he says. In many cases, just knowing that steganography is going on and who is responsible is enough for a business; it can confront the person and take steps to prevent further leaks, Collberg says. But businesses can take more active steps, such as destroying the secret messages by altering the carrier file. Free programs such as Stirmark, which scramble files enough to destroy steganographic messages, are available online. For instance, if the carrier is an image file, setting all the least significant bits to zero would destroy any messages contained there without significantly changing the appearance of the image, he says.
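A minimal sketch of that countermeasure, assuming a raw, uncompressed image buffer: clearing every least significant bit removes the covert channel while changing each byte by at most one. (Tools such as Stirmark perturb carriers far more aggressively; this is only the simplest case.)

```python
def scrub_lsb(pixels: bytes) -> bytes:
    """Force the lowest bit of every byte to zero, wiping any LSB-embedded message."""
    return bytes(b & 0xFE for b in pixels)

carrier = bytes([200, 201, 202, 203])            # stand-in pixel values
scrubbed = scrub_lsb(carrier)
print(list(scrubbed))                            # [200, 200, 202, 202]
assert all(b & 1 == 0 for b in scrubbed)         # no LSB channel left to read
assert max(abs(a - b) for a, b in zip(carrier, scrubbed)) <= 1   # image barely changes
```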

Keith Bertolino, founder of digital forensics start-up E.R. Forensics, based in West Nyack, N.Y., has developed double stegging – inserting steganographic messages within files with the intent of disrupting other steganographic messages that might also be in the files. He is waiting to find out if he gets a Small Business Innovation Research (SBIR) grant from the government to pursue turning his steganography-jamming technology into a commercial product. According to Hosmer, a look at evidence in closed cases of electronic crime found that in 3% of those cases, criminals had steganographic programs installed on their computers. "The fact that these criminals were even aware [of steganography] was a startling surprise to law enforcement agencies," he says. Interest in steganography is growing, according to WetStone Technologies' monitoring of six popular steganography applications.

In 2008, the six combined logged 30,000 downloads per month, up from 8,000 to 10,000 per month about three years ago, Hosmer says. That's not a dramatic increase given that the use of Internet-connected computers has gone up in the meantime, but it is still noteworthy, he says. Steganography is not always bad. Technically, steganography is just the same as digital watermarking, but with different intent, Collberg says. A watermark is a secret message embedded, for instance, in an image file so that if the image is used online, a Web crawler can find it. Then the creator of the image can check whether the site displaying the image has paid for it or is violating copyright, he says.

Exchange 2010 hits RTM

Microsoft Thursday concluded development on Exchange 2010 and said the new mail server would ship on Nov. 9 at the company's TechEd Conference in Berlin, Germany. In addition, the server is being touted as a hybrid - equally at home as the foundation for a hosted e-mail service or a corporate messaging infrastructure. Exchange 2010, which is a 64-bit only server, includes new storage and deployment options, enhanced in-box management capabilities, built-in e-mail archiving, new database clustering, additional hardware options, and a revamped Outlook Web Access client.

"Our senior leadership team has signed off on the final code, and it has been sent to our early adopters for one final look before its public release," read a blog post signed by "The Exchange Team". The company said that the ability to use Exchange as a hosting platform is now built into the product. The hosted version of Exchange 2010, however, is not expected to ship until May or June 2010. Microsoft already hosts more than 5 million users on Exchange 2010 as part of its Live@Edu program, and end users are already planning corporate rollouts, including Ford Motor Co., which plans to deploy 100,000 seats. Microsoft has said previously that it has specially architected Exchange 2010 for high availability and cross-domain integration using techniques such as pairing the server with Windows Server 2008 clustering technology and directory federation features. Lee Dumas, the director of architecture for Azaleos, a provider of remote management services for Exchange and SharePoint, says Exchange 2010 has challenges and rewards. "I'm not slamming Exchange, but to achieve the level of [service-level agreements], and dealing with large amounts of data, multiple copies of databases, server roles, and load balancing makes complexity inherent in getting the whole system in place," he says. The rewards, however, will follow for those that heed due diligence, he says.

Network World Lab Alliance member Joel Snyder said in his Exchange 2010 review that corporate users should carefully assess the implications of the new server. "The combination of clustering, replication and low-cost disk support means that reliability and scalability can be based on replicating small, inexpensive servers both within a data center and between data centers. E-mail managers thinking of deploying Exchange 2010 should step back and evaluate closely these new grid-style architectural approaches - and be sure that your Exchange team has adequate time to re-think and re-evaluate commonly held beliefs on how to build large Exchange networks." Exchange 2010 is the first in a wave of new Office products set to ship this year and next. Office 2010, SharePoint Server 2010, Office Communications Server 2010, Visio 2010 and Project 2010 are slated to ship in the first half of 2010.

Detailing contingency planning

On Oct. 27, 2009, the National Institute of Standards and Technology (NIST) Information Technology Laboratory (ITL) Computer Security Division (CSD) published Special Publication (SP) 800-34 Revision (Rev) 1, "DRAFT Contingency Planning Guide for Federal Information Systems" and requested comments from readers by Jan. 6, 2010. The official announcement described the SP as follows: SP 800-34 Revision 1 is intended to help organizations by providing instructions, recommendations, and considerations for federal information system contingency planning. The guide defines a seven-step contingency planning process that an organization may apply to develop and maintain a viable contingency planning program for their information systems. Contingency planning refers to interim measures to recover information system services after a disruption. The guide also presents three sample formats for developing an information system contingency plan based on low, moderate, or high impact level, as defined by Federal Information Processing Standard (FIPS) 199, Standards for Security Categorization of Federal Information and Information Systems.

Authors Marianne Swanson, Pauline Bowen, Amy Wohl Phillips, Dean Gallup, and David Lynes include two of the six authors of the June 2002 original version of SP 800-34 (Swanson, Wohl, Lucinda Pope, Tim Grance, Joan Hash and Ray Thomas) and have, as usual for NIST ITL CSD, done a superb job of preparing a framework that lays out a sound basis for business continuity planning (BCP). The 150-page SP begins with an introduction presenting the purpose, scope and audience for 800-34 Rev 1. Page 13 of the PDF file describes the purpose as providing "guidelines to individuals responsible for preparing and maintaining information system contingency plans (ISCP). The document discusses essential contingency plan elements and processes, highlights specific considerations and concerns associated with contingency planning for various types of information system platforms, and provides examples to assist readers in developing their own ISCPs." This document explicitly excludes discussion of disaster recovery. The scope is defined as "recommended guidelines for federal organizations" (p 14), and the audience is "managers within federal organizations and those individuals responsible for information systems or security at system and operational levels. It is also written to assist emergency management personnel who coordinate facility-level contingencies with supporting information system contingency planning activities." (p 15) Despite the inclusion of "for Federal Information Systems" in the title, SP 800-34 Rev 1 has a great deal of value for all information assurance and business continuity specialists. Indeed, the authors write, "The concepts presented in this document are specific to government systems, but may be used by private and commercial organizations, including contractor systems." They then list a wide range of specific job titles of people likely to find the document useful, including IT managers, CIOs, systems engineers, and system architects. References to Federal Information Processing Standards (FIPS) in no way prevent the guidelines from serving organizations outside the U.S. federal government.

The authors describe the structure of the document clearly as follows (p 16):
• Section 2, Background, provides background information about contingency planning, including the purpose of various security and emergency management-related plans, their relationships to ISCPs, and how the plans are integrated into an organization's overall resilience strategy by implementing the six steps of the Risk Management Framework (RMF)….
• Section 3, Information System Contingency Planning Process, details the fundamental planning principles necessary for developing an effective contingency capability. This section presents contingency planning guidelines for all elements of the planning cycle, including business impact analysis, alternate site selection, and recovery strategies. The principles outlined in this section are applicable to all information systems. The section also discusses the development of contingency plan teams and the roles and responsibilities commonly assigned to personnel during plan activation.
• Section 4, Information System Contingency Plan Development, breaks down the activities necessary to document the contingency strategy and develop the ISCP. Maintaining, testing, training, and exercising the contingency plan are also discussed in this section.
• Section 5, Technical Contingency Planning Considerations, describes contingency planning concerns specific to the information systems listed in Section 1.3, Scope. This section helps contingency planners identify, select, and implement the appropriate technical contingency measures for their given systems.

The nine appendices provide practical templates and checklists of great utility in BCP. There is so much valuable information here, offered in a structured, clear presentation, that every IA professional concerned with BCP should read – and, I hope, comment on – this draft publication.

Verizon updates Droid software; Users hope it fixes echo problem

An over-the-air software update to the Droid smartphone started yesterday, but it wasn't clear whether the 14 enhancements address a voice echo problem that hundreds of users complained about in online forums. The enhancements come from Verizon Wireless, Motorola and Google, which is behind the Android operating system that runs on the Motorola Droid. The much-anticipated update, identified as ESD56, went to a "small percentage of handsets" yesterday and will be phased in over the next week or so, a Verizon Wireless spokeswoman confirmed early today via e-mail. An update to the Droid Eris smartphone from HTC is "planned but a date has not yet been confirmed," the spokeswoman added.

However, it remains unclear whether the list of official fixes offers any relief to hundreds of customers who have complained of a voice echo heard by recipients of calls made from Droid phones. The Motorola Droid update is based on Google's release of a software developer kit for Android 2.0.2 on Dec. 6. The most noticeable modifications improve the Droid's camera autofocus capability and the phone's voice reception, the spokeswoman added. At least 300 comments at a Motorola online support forum refer to the subject, "Droid phone sound quality is not great," and most comments refer to audio echo problems noticed by people whom Droid users are calling. One Motorola Droid user, John Davis, said he has enjoyed all aspects of his Droid except for the phone itself. "Almost from day one there has been an annoying echo primarily with the person on the receiving end," he wrote in an e-mail to Computerworld. Davis, a physician, bought his phone the first day it was available at a Verizon store near Boston. Despite the many online complaints of a similar problem from Droid users, he couldn't get Verizon store officials to listen to him, he said. "Each time I returned to the store, now three times, I have been treated increasingly like an Android from out of space until [a recent] Friday when I threw a nutty in the store and screamed out for attention," he wrote. "The techs were clueless." Davis said his son, an engineer at Cisco Systems Inc., helped him decrease the echo somewhat by adjusting the phone's settings so that when the echo shows up, Davis must fidget with the speaker button to lessen the echo.

But Davis was still awaiting the update, which was rumored to start on Dec. 11, but now appears to have started four days earlier. The official update documentation says only that one of the 14 improvements is listed as "audio for incoming calls is improved." A separate improvement says that Bluetooth functions are improved with "background echo ... eliminated," but only in reference to Bluetooth usage. The full list also includes improvements to OS stability, battery life and camera auto focus. Davis said his son believes the update is designed to address the issue, and so do many on an online forum. Ironically, many reviewers of the Motorola Droid found it has superlative sound quality, so the echo problem could be a function of networks as well as the Droid, many forum users have noted. Davis said he had no significant problems with his camera, but is still eager to have the update for the camera focus.

A Motorola support forums manager, identified online only as Matt, called attention to the update yesterday with a link to the separate Motorola forum on sound quality, implying that the improvements could help the echo problem. Verizon has noted that to get the free update, the Droid needs at least 40% battery charge if it's not connected to an external power source, or at least 20% if it is connected to one. The Verizon spokeswoman did not answer directly whether the updates fix the echo problem, saying only that descriptions of the audio problem on forums are "subjective," but she offered to provide a fuller explanation later.

Google Search Page Gets a New Look

Google has introduced a new version of the search engine's home page, which features a sleek fade-in effect that hides all the elements of the page except the logo, search bar, and the buttons. When accessing the main Google search page, you will see only the Google logo (or the doodle of the day) and the super-sized search bar (introduced a few months ago) with the search buttons underneath. The rest of the elements of the page, such as links to Gmail, Documents, News, Maps, Shopping, etc., are revealed with a fancy fade-in effect when you first move the cursor on the screen.

Google's new search homepage is now even less crowded in comparison to Bing, the competing search engine from Microsoft, which displays a different background image behind the search bar daily and features search queries of interest. The fading Google homepage was first noticed a few months ago, when Google was experimenting with different designs. The search company says it tried about ten versions of the fading homepage and chose the current one based on "user happiness metrics." Some of the earlier versions of the fade-in Google homepage had an even more minimalistic approach, with the search buttons hidden at first. The final version of the fading homepage is now being introduced to Google home pages around the world. Google explains in a blog post that it was concerned with the time to first action on the new homepage, which could confuse users initially. "We want users to notice this change... and it does take time to notice something (though in this case, only milliseconds!)," the company wrote. "Our goal then became to understand whether or not over time the users began to use the homepage even more efficiently than the control group and, sure enough, that was the trend we observed," the Google team explained. Google also introduced a better format for image search results earlier this week.

The new image search layout will show a larger image and additional smaller images alongside. In a previous update in November, Google also introduced Image Swirl, which brings layers of similar images into searches.

HP to focus on services with new print division

Hewlett-Packard on Monday formed a new print services division with a focus on managing print and imaging hardware and software in enterprises. The unit will also provide services and software that put scanned or printed documents in workflow systems to make document management easier. The division, called Managed Enterprise Solutions, aims to unify disparate hardware such as copiers, printers and scanners in order to cut hardware and printing costs, said Vyomesh Joshi, executive vice president of HP's imaging and printing group.

The company's attention has been geared toward hardware and supplies, but software and services surrounding printing and imaging are a growing opportunity, Joshi said. The company sees a US$121 billion annual opportunity in the printing market, of which $64 billion is for hardware and $57 billion for software and services. There is more to printing than just hitting the print button, said Roger Douglas, director of managed print services at HP. For example, software provided with the managed services could enable an invoice to be scanned and automatically put into a company's payroll system. The automation reduces the number of steps and the cost required to manage the document, Douglas said. Documents can also be secured through a service by establishing a status to ensure they aren't altered, Douglas said.

It also reduces the chance of error from manual transcription. For example, if a marketing logo is finalized on a particular document, its status can be appended to ensure no one changes it. This approach is particularly helpful when editing legal documents, he said. The company is also changing printer designs to build in more services-related functionality. For example, a touch screen on multifunction printers can be used to input or check the job status of scanned documents like patient records. "A lot of times customers have treated imaging and printing like an afterthought," Douglas said. The company has also expanded the availability of a program that guarantees savings for customers who sign up for its print services outside the U.S. Under the plan, HP assesses a company's imaging and printing environment and calculates the possible savings a company can realize using HP's managed services.

If customers haven't realized the savings in a year, HP will make up the difference with a credit that can be used for their next printing services contract. The company has already signed up 100 customers since it launched that program, Joshi said. Managed print is all about stepping back and taking a more strategic and methodical look at how those documents are managed, he said. The unit will be a part of the company's imaging and printing division, Joshi said. The company has pulled some personnel from the existing services division and has seen its services customer base expand since acquiring EDS. HP has a strong presence in the printer market, and the expansion of services could help the company capture a larger share in the printer space, said Edward Crowley, CEO of Photizo Group, who was at HP's press briefing Monday.

The increased level of focus on services could also benefit HP's enterprise customers, he said.

States scramble to track federal stimulus bucks

There's no such thing as a free lunch, especially for IT. Go to the state of Iowa's Web site, and you can see that of the $2.5 billion in federal economic stimulus money earmarked for the state under the American Recovery and Reinvestment Act of 2009 (ARRA), $553 million has already been spent on health, education, infrastructure and other programs designed to create jobs and jump-start the local economy. Drill a bit deeper into the data and you can pull up the exact amounts spent on weatherization training and technical programs, rental assistance programs and hundreds of other individual projects. Iowa CIO John Gillespie figures his IT organization has devoted about 800 man-hours so far to making that data available to the state's citizenry. "We actually had to build the application to give [different state agencies and programs] a way to submit data to us," he says.

But the far bigger challenge, Gillespie says, has been building the business rules and defining internal processes to comply with federal reporting requirements, which have changed or been updated several times since the stimulus package was first announced in February. Just as the $787 billion ARRA is unprecedented, so are the reporting demands it's making on state CIOs and IT organizations, which are scrambling to whip up new processes and tools to accurately track and account for their states' shares of the stimulus pie. States are required to file quarterly reports that fully account for every tax dollar spent. The process has been complicated by a variety of factors, including exceedingly tight deadlines and complex and changing federal reporting guidelines. Like so many of the energy and construction projects launched with stimulus dollars, tracking and reporting systems remain works in progress.

The best state Web sites for ARRA tracking, as of July: 1. Maryland, 2. Colorado, 3. Washington, 4. West Virginia, 5. New York, 6. Pennsylvania. - Mitch Betts Source: study by Good Jobs First, Washington, July 2009

Another big problem is the lack of a central accounting system in most states, which have had to first devise ways of extracting and aggregating data from multiple systems across hundreds of agencies before rolling it up to report it to the federal government. In Missouri, one of a handful of states to have a central accounting system used across all state agencies, funding and budgeting data is relatively easy to access. What remains difficult to grasp, however, is precisely what the federal government wants to know, says CIO Bill Bryan. Two data points the feds want to track are job creation and retention under the economic stimulus program. "But the definition and requirements for how to count jobs is quite a challenge to understand," according to Marilyn Gerard-Hartman, director of enterprise applications for Missouri. For example, if the state awards a highway infrastructure project to a contractor who in turn hires a subcontractor, who in turn hires other subcontractors, "how far down the chain is the state responsible for tracking? And do you only count it as a job created if the job wouldn't have existed without the ARRA funding?" Generally, "it hasn't been clear what the requirements are until fairly late in the game," adds Bryan. Meanwhile, fulfilling the requirements to the letter of the federal law is critical, Bryan notes. "If you don't comply, you could get thrown under the bus and not get any further funding."

"One of the biggest challenges is just the speed at which we had to get things done," says Iowa's Gillespie. "The rules for the most part didn't get finalized literally until weeks - not months - ago. Just keeping up has been the biggest challenge." Rather than licensing commercial stimulus-tracking tools, Gillespie's team internally developed a tracking and reporting system "using tools already familiar to financial folks who have all the data in Excel spreadsheets," he explains. First things first: before IT could build the tracking system, "we actually had to build a Web-based application to give people a way to submit data to us," Gillespie explains. The data is imported into a database, where it is aggregated, extracted and converted to an XML-formatted report and submitted online to the federal government.
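As a rough idea of the last step in that pipeline, here is a hedged Python sketch that rolls aggregated project rows up into an XML payload; the element names, award numbers and fields are invented for illustration, and the real federal reporting schema certainly differs.

```python
import xml.etree.ElementTree as ET

# Aggregated rows as they might come out of the database step described above.
rows = [
    {"project": "Weatherization training", "award": "IA-EX-001", "spent": 1200000},
    {"project": "Rental assistance",       "award": "IA-EX-002", "spent": 850000},
]

report = ET.Element("RecipientReport", state="IA", quarter="2009Q3")
for row in rows:
    item = ET.SubElement(report, "Project", awardNumber=row["award"])
    ET.SubElement(item, "Name").text = row["project"]
    ET.SubElement(item, "AmountSpent").text = str(row["spent"])

# The resulting XML string is what would be uploaded to the federal system.
print(ET.tostring(report, encoding="unicode"))
```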

Stimulus reporting: The basics More than two-dozen federal agencies have been allocated a portion of the $787 billion in stimulus money. Each federal agency develops specific plans for its share, then awards grants and contracts to state governments or, in some cases, directly to schools, hospitals, contractors or other organizations. The federal agencies are required to file weekly financial reports on how they're spending the money and their specific activities involving ARRA funds. Last month, states and other grant recipients for the first time filed the quarterly spending reports required under the law. The executive branch has worked hard to ensure a smooth reporting operation, according to Rob Nabors, director of the federal Office of Management and Budget. As Nabors puts it: "Between OMB and the vice president's office and others in the White House and out in the agencies, we've done 169 different conference calls with recipients.

There've been 170 events with state officials. There have been seven White House forums, and there've been 20 separate Recovery Act reporting training sessions. We've had 37 different events with local government officials. That by itself is an unprecedented effort by the federal government to make sure that we get it right, and this was something that started all the way back in February." - Julia King All of this was done in a matter of weeks by a small team comprising a designer-architect, a programmer and a project manager who is the chief liaison between IT and the state's stimulus office. The team already had some experience from working on the state's recovery Web site, which Gillespie says has "been kind of an iterative process that has been going on since the first recovery money came out." He chalks up the speedy rate of progress on both projects to what he describes as healthy competition among states to have good reporting and great Web sites. "We wanted to be better than everybody else," he says. But given the fierce push to quickly distribute ARRA funds and get new projects up and running, traditional IT project management practices, such as having a comprehensive set of user requirements, have in some cases gone out the window.

In Missouri, for example, a team was in the midst of implementing Microsoft Corp.'s Stimulus360 software for tracking funding and projects when the feds issued a change in data models for reporting. Tracking and reporting how many jobs are created with ARRA funds is a prime example. "We had to move forward with plans and put things in place even though you knew [more] changes were coming," says Gerard-Hartman. Like a start-up: at the newly formed Massachusetts Recovery and Reinvestment Office, Deputy Director Ramesh Advani likens the fast pace and deadline-driven atmosphere and culture to the environment of a start-up company. "When you're doing something for the first time, you deal with systems issues, people issues, deadline issues, and you come across something new every day," he says. Meeting the first federal reporting deadline of Oct. 10 required some on-the-fly tactics. "For the first round of reporting, what we have done is develop some manual templates, which we issued out to state agencies. That data will then be gathered and uploaded into a recently launched central database, then uploaded through XML to the federal system," he explains.

Each agency must use the same template and also pass it on to subrecipients and vendors. But this is only for the first round of reporting, Advani emphasizes. For the long term, the state is developing an automated data-gathering and -analysis system that includes Oracle Corp. business intelligence tools, data marts and data warehousing. "We're trying not to make this a short-term solution, because ultimately we want to upgrade how we do grants management and our budgets across the board. We want to make sure we can use the same tools and reporting database beyond ARRA," Advani says.

ARRA timeline: 2009 * Feb. 17: The American Recovery and Reinvestment Act is signed into law. The federal government's Recovery.gov Web site goes live. * Feb. 19: Federal agencies begin announcing block grant awards. * May 17: Agency and program plans are posted on Recovery.gov. * May - October: ARRA stimulus money is distributed to states and other recipients. * May - August: Reporting requirements and updates are developed and distributed. * Sept. 28: Recovery.gov is relaunched with geographic mapping. * Oct. 10: The first quarterly deadline for recipient reporting is reached. * Oct. 30: Recipient grant and loan data is posted. - Julia King

ARRA's tight timeline and strict reporting deadlines have already driven some key process improvements in the state. For example, funding for ARRA transportation and highway projects had a 120-day "use it or lose it" deadline. But it typically took the state between 100 and 300 days to advertise projects and solicit bids from contractors. "We ended up drilling down into the system to get it down to a 40-day process, which is something we're proud about," Advani says. Another long-term improvement is the creation of the program office itself. Going forward, the office will oversee all activities involving grants reporting, monitoring and compliance. The goal is to increase overall information transparency "beyond what's expected for federal reporting," says Advani.

Before ARRA, for example, the state didn't provide electronic versions of contracts on its Web site - which is a requirement under ARRA. "Now, we'll take that technology and process new contracts through the same system so we can provide broader information on all contracts that have been awarded," he says. Maine CIO Dick Thompson says he's already been directed by a legislative committee to ensure that IT work done to meet ARRA reporting requirements is also used to increase information transparency statewide. The result of ARRA reporting, CIOs agree, is a lot like the road signs popping up that say: "Temporary Inconvenience, Permanent Improvement."

Network-based e-mail – Ready for prime time?

In the prior newsletter, we raised the question of whether the time is here – or past due – for moving e-mail from local PCs back to the network. This time we want to continue the discussion by looking at some of the key questions that need to be addressed. Security: Of course, this is always the first question for any public e-mail services (such as Amazon). Is your "private" e-mail really private?

In truth, our answer is "probably not." However, anything that has ever transited the Internet is likewise probably not truly private. In reality, any illusion of true security is probably just that – an illusion. Data security: Yet another way of looking at "security." How difficult would it be for someone to hack into your public cloud-based e-mail? Given enough time and enough tries at a given account, the answer would be "not very." However, just to put this into perspective, what is the relative risk of someone hacking an online account vs. having a notebook computer (containing the same information) lost or stolen?

Private or public cloud: This is a tough one, and a lot depends on scale. For the SMB, the public services probably are quite appropriate. For larger shops, it's a more complex call. That said, we've seen numerous shops totally "outsourcing" e-mail to services like Google. This is basically the same as any other cloud application. It alleviates the necessity of having local servers, maintaining these servers, backing up on a regular schedule…

Storage availability: One of the major reasons years ago for moving to a PC-based service was that network storage was limited and expensive. Now, however, that's no longer a stumbling block. Even the free version of Gmail offers individuals over 7GB of storage (with a constantly incrementing counter), and additional storage is available at an "almost free" price.

Compliance: A great question, and one that we'll be looking for your input on. Our initial take is that compliance with various regulations can be handled once by the cloud-based organization and then applied for multiple customers. More on this to come.

Integrated interfaces and collaboration: Clearly, this is an area where we'll be seeing significant interest in the near future. For now, the services are looking "good," and we're eagerly awaiting checking out "Google Wave" as a look at the next generation.

Personalization: Right. You don't want to have your corporate image be [fill-in-the-blank]@gmail.com or [fill-in-the-blank]@yahoo.com. If it's a private cloud, then you still have your own servers. And even if it's a public cloud, it's trivial to personalize with your own domain name. No problem.

And this is only a starting point for this issue. For a continuation, we invite you to join our discussion on this topic at TECHNOtorials.com.

Microsoft correctly predicts reliable exploits just 27% of the time

Microsoft's monthly predictions about whether hackers will create reliable exploit code for its bugs were right only about a quarter of the time in the first half of 2009, the company acknowledged Monday. "That's not as good as a coin toss," said Andrew Storms, director of security operations at nCircle Network Security. "So what's the point?" In October 2008, Microsoft added an "Exploitability Index" to the security bulletins it issues each month. The idea was to give customers more information to decide which vulnerabilities should be patched first. The index rates bugs on a scale from 1 to 3, with 1 indicating that consistently successful exploit code was likely in the next 30 days, and 3 meaning that working exploit code was unlikely during that same period. Before the introduction of the index, Microsoft only offered impact ratings - "critical," "important," "moderate" and "low" - as an aid for users puzzled by which flaws should be fixed immediately and which could be set aside for the moment.
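To illustrate how the two ratings were meant to be combined, here is a hedged Python sketch that triages hypothetical bulletins by impact rating first and Exploitability Index second; the bulletin names and ratings are made up, and this ordering is an assumption rather than Microsoft guidance.

```python
SEVERITY_RANK = {"critical": 0, "important": 1, "moderate": 2, "low": 3}

bulletins = [
    {"id": "bulletin-A", "severity": "critical",  "exploitability": 2},
    {"id": "bulletin-B", "severity": "critical",  "exploitability": 1},
    {"id": "bulletin-C", "severity": "important", "exploitability": 1},
]

# Sort by impact rating first, then by Exploitability Index (1 = exploit likely soon).
patch_order = sorted(
    bulletins, key=lambda b: (SEVERITY_RANK[b["severity"]], b["exploitability"]))

print([b["id"] for b in patch_order])   # ['bulletin-B', 'bulletin-A', 'bulletin-C']
```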

"Forty-one vulnerabilities were assigned an Exploitability Index rating of 1, meaning that they were considered the most likely to be exploited within 30 days of the associated security bulletin's release," Microsoft stated in its bi-annual security intelligence report, which it published Monday. "Of these, 11 were, in fact, exploited within 30 days." That means Microsoft got it right about 27% of the time. Microsoft also tallied its predictions by security bulletins - in many cases a single bulletin included patches for multiple vulnerabilities - to come up with a better batting average. "Sixteen bulletins received a severity rating of Critical," it said in its report. "Of these, 11 were assigned an Exploitability Index rating of 1. Five of these 11 bulletins addressed vulnerabilities that were publicly exploited within 30 days, for an aggregate false positive rate of 55%." The company defended its poor showing - even on a bulletin-by-bulletin level it accurately predicted exploitability only 45% of the time - by saying it was playing it safe. "The higher false positive rate for Critical security bulletins can be attributed to the conservative approach used during the assessment process to ensure the highest degree of customer protection for the most severe class of issues," said Microsoft. "There's some validity to that," agreed Storms. "They're going to err on the side of caution, if only to prevent people saying 'I told you so' if an exploit appears later." John Pescatore, Gartner's primary security analyst, agreed, but added, "If they want to stick with the index, they need to adjust the criteria so fewer vulnerabilities get a '1.'" With vulnerability-by-vulnerability predictions correct only a fourth of the time, Storms questioned the usefulness of the exploitability index. "What's the point of the index if they're always going to side on the more risky side, as opposed to what's most likely?" he asked. "In some ways, we're back to where we were before they introduced the exploitability index." From Storms' point of view, the exploitability index was meant to provide more granular information to customers who wondered what should be patched first. Presumably, a vulnerability marked critical with an index rating of "1" would take precedence over a critical vulnerability tagged as "2" or "3" on the exploitability index. But in the first half of this year, Microsoft correctly predicted exploits just slightly more than one out of every four times. "With these numbers of false positives, we are in no better place than we were prior to the index, in respect to granularity," he said. Pescatore also questioned the usefulness of the exploitability index. "I doubt anyone even looks at it," he said. Instead, Pescatore again argued, as he did last year when Microsoft debuted the index, that the company would better serve customers by abandoning its own severity and exploitability rankings, and move to the standard CVSS [Common Vulnerability Scoring System] ratings.

The CVSS system is used by, among other companies and organizations, Oracle, Cisco and US-CERT. "Because Microsoft does its own exploitability index, enterprises can't compare theirs with Adobe's or Oracle's. It's an apples and oranges thing then," said Pescatore. "It's not just Windows bugs that companies have to deal with anymore." He doubted Microsoft would take his advice. "They don't want to do that because then reporters and analysts can look and say, 'Microsoft has more higher-rated vulnerabilities than Oracle or Adobe,'" he said. "There's nothing in it for them to do that." Microsoft made the right call on all 46 vulnerabilities that were assigned an exploitability rating of "2" or "3," which indicate that an exploit would be unreliable or unlikely, respectively. "None were identified to have been publicly exploited within 30 days," Microsoft's report noted. If all its predictions in the first half of 2009 are considered, not just those marked as likely to be exploited, Microsoft got 57 out of a possible 87, or 66% of them, right. Microsoft's security intelligence report, which covers the January-June 2009 period and can be downloaded from its Web site in PDF or XPS document formats, was the first to spell out the accuracy of the exploitability index.
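For readers following along, here is the arithmetic behind the percentages quoted above, using only figures cited in this article.

```python
# Vulnerability-level figures for the first half of 2009.
rated_1, exploited_of_rated_1 = 41, 11
rated_2_or_3, exploited_of_rated_2_or_3 = 46, 0

print(f"{exploited_of_rated_1 / rated_1:.0%}")            # ~27%: the headline hit rate

# Counting every prediction, right calls = 11 exploited "1"s + 46 quiet "2"/"3"s.
correct = exploited_of_rated_1 + (rated_2_or_3 - exploited_of_rated_2_or_3)
total = rated_1 + rated_2_or_3
print(correct, "of", total, f"= {correct / total:.0%}")    # 57 of 87 = 66%

# Bulletin-level view: 11 Critical bulletins rated "1", 5 actually exploited.
critical_rated_1, exploited_bulletins = 11, 5
print(f"{exploited_bulletins / critical_rated_1:.0%}")     # ~45% accurate at bulletin level
```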

But Microsoft has touted its forecasting before. A year ago, for example, Microsoft said in a postmortem of its first-ever index that although it had accurately predicted exploits less than half the time, it considered the tool a success. "I think we did really well," said Mike Reavey, group manager at the Microsoft Security Response Center (MSRC), at the time.

Cloudera intros Hadoop management tools

Startup Cloudera is introducing a set of applications on Friday for working with Hadoop, the open-source framework for large-scale data processing and analysis. It allows an application workload to be spread over clusters of commodity hardware, and also includes a distributed file system. Cloudera, which provides Hadoop support to enterprises, developed the new browser-based application suite to simplify the process of using Hadoop, according to CEO Mike Olson. "It's an easy-to-use GUI suitable for people who don't have a lot of Hadoop expertise," Olson said. "The big Web properties with sophisticated and talented PhDs have been successful [with it], but ordinary IT shops ... have had a harder time." Hadoop is known for its behind-the-scenes role crunching oceans of information for Web operations like Facebook and Yahoo. But although the technology is "at its best" when data volumes get into multiple terabytes, Hadoop has relevance for a wide variety of companies, according to Olson. "It's increasingly easy to get your hands on that much data these days," especially from machine-generated information like Web logs, he said.
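For readers unfamiliar with how work gets spread across a Hadoop cluster, here is the canonical word-count example written in the Hadoop Streaming style (a mapper and a reducer that read and write plain text); the local dry run at the bottom is only for illustration and omits all of the real job wiring.

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit (word, 1) pairs; Hadoop shuffles and sorts them by key."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum the counts for each word (pairs arrive grouped by key)."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Local dry run; under Hadoop Streaming the two halves run as separate tasks
    # reading from stdin and writing tab-separated pairs to stdout.
    sample = ["Hadoop spreads work over clusters", "clusters of commodity hardware"]
    print(dict(reducer(mapper(sample))))
```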

Cloudera and its partners are fine-tuning the suite, which is now in beta, before issuing a general release. The browser-based application set is supported on Windows, Mac and Linux, and includes four modules: a file browser; a tool for creating, executing and archiving jobs; a tool for monitoring the status of jobs; and a "cluster health dashboard" for keeping tabs on a cluster's performance. Hadoop needs many more tools like it, according to analyst Curt Monash of Monash Research. "If Hadoop is to consistently handle workloads as diverse and demanding as those of [massively parallel processing] relational DBMSes, it needs a lot of tools and infrastructure," Monash said via e-mail. "The three leaders in developing those are Yahoo, Cloudera, and Facebook. There's a long way to go."

Nasty banking Trojan makes mules of victims

A sophisticated Trojan horse program designed to empty bank accounts has a new trick up its sleeve: It lies to investigators about where the money is going. First uncovered by Finjan Software last week, the URLzone Trojan is already known to be very advanced. It rewrites bank pages so that the victims don't know that their accounts have been emptied, and it also has a sophisticated command-and-control interface that lets the bad guys pre-set what percentage of the account balance they want to clear out. But Finjan isn't the only company looking into URLzone.

RSA Security researchers say the software uses several techniques to spot machines that are run by investigators and law enforcement. When URLzone identifies one of these, it sends it bogus information, according to Aviv Raff, RSA's FraudAction research lab manager. Researchers typically create their own programs that are designed to mimic the behavior of real Trojans. When URLzone spots a researcher's program, instead of simply disconnecting from the researcher's computer, the server tells it to do a money transfer. But instead of transferring the money into one of the criminal's money mules - people who have been recruited to move cash overseas - it chooses an innocent victim. Security experts have long published research into the inner workings of malicious computer programs such as URLzone, Raff said. "Now the other side knows that they are being watched and they're acting," he said.

The idea is to confuse researchers and to prevent the criminal's real money mules from being discovered. So far, more than 400 legitimate accounts have been used in this way, RSA said. Typically, these accounts belong to people who have received legitimate money transfers from other hacked computers on the network, Raff said. According to Finjan, URLzone infected about 6,400 computer users last month and was clearing about €12,000 (US$17,500) per day. Banking Trojans such as Zeus and Clampi have been emptying accounts for years now, but Finjan dubbed URLzone the first of a new, smarter generation of crimeware.

DOJ tries to step in front of IBM mainframe biz steamroller

IBM's mainframe business has over the years been somewhat of a steamroller, rarely slowing down to take a look at the little things in its way it may have crushed. Word today that the Department of Justice has begun a preliminary investigation into whether IBM has abused its monopoly in mainframe computers will likely fall under the category of things the Big Iron slowed down for but ultimately ran over anyway. Not to diminish what is a preliminary DOJ investigation, but the background to this one sounds a little like things are just going in a circle. To start, published reports say the Computer and Communications Industry Association had complained to the DOJ recently about some unsavory behavior on IBM's part.

Thus the DOJ began asking IBM rivals for information about the mainframe market. One of those rivals was T3 Technologies, which just last week lost a civil suit it had filed against IBM. According to the Financial Times, the judge said T3 did not have a case to make because IBM had licensed the technology in the past to two of the company's suppliers, rather than to T3 directly. He added that the suppliers, which had brought complaints of their own, also lacked a case, since IBM had acted within its rights to stop licensing its old software when it moved on to a new generation of technology. But T3's IBM angst runs deeper: the company has an ongoing antitrust complaint against IBM with European regulators as well.

The company was a strong supporter of fellow IBM mainframe clone maker Platform Solutions (PSI), a privately held company with whom Big Blue had traded lawsuits over Big Iron technology until 2008, when IBM had enough and just bought the company outright. Long time IBM critic CCIA also weighed in at the time, calling the IBM/PSI acquisition a black hole: "It sucks the life out of the market and destroys the matter." The deal transforms a market with potential for competition into one "with little prospects for anything but complete domination by IBM," said CCIA President and CEO Ed Black. "When it comes to violations of competition law, IBM appears to be an unrepentant recidivist," he said. The DOJ abandoned a 1956 antitrust consent decree against IBM in 2001, but said that IBM could be subject to a new antitrust lawsuit if it engaged in anticompetitive behavior. As in the new case, rivals say IBM has shut out other mainframe vendors by ending support for older mainframe systems and not licensing its mainframe software to rivals. Will all of this hurt IBM? Seems unlikely.

CCIA's complaint against IBM alleges that the company has refused to issue licenses for IBM's mainframe OS to competitors, as required in a 1970s antitrust consent decree with the DOJ that was terminated in 2001. "In some cases, IBM has yanked the OS license from customers trying to switch from IBM mainframe hardware to a competitor's," charges Black. According to a recent study of 300 end users by researchers at IDC, nearly one-half said they plan to increase annual spending on mainframe hardware and software. The study says IBM's strategy of building specialty processors for the mainframe, such as the Integrated Facility for Linux (IFL), the System z Integrated Information Processor (zIIP) for ERP and CRM transactions, and the z Application Assist Processor (zAAP) for Java and XML transactions, is key to the ongoing success of the platform. Many mainframe users reported that they plan another wave of investments in the System z platform over the next 2–5 years, citing the system's high availability, reliability, and security for mission-critical applications as major drivers, IDC stated. IBM has engaged in some price cutting to make some of these processors more palatable, though. IBM acknowledged "new pricing" for the IFL processors, but did not offer specific numbers.

According to a Network World article, IBM has cut prices in half for some specialty Linux processors. Another source said the price changed from $90,000 to $47,500 for IFLs running on the System z Business Class mainframe. But IBM's mainframes haven't been immune to the economic downturn. This summer IBM reported that System z mainframe server revenue decreased 39% year-over-year in the second quarter, while overall company revenue declined 13%.

Netbooks propel global semiconductor sales

Chip sales are growing by the month, partly driven by growing netbook sales and falling laptop prices, market research firms said on Friday. Global semiconductor sales grew by 5 percent from July to August as consumers picked up more laptops, especially netbooks, the Semiconductor Industry Association (SIA) said in a study. Consumers are drawn to the lower prices of netbooks, which account for close to 17 percent of total laptop unit shipments worldwide, SIA said. Netbooks have small screens and keyboards and are designed to run basic Internet and productivity applications. "Growing sales of netbook personal computers ... have created an important new market segment, filling a gap between 'smart cell phones' and conventional laptop PCs," said George Scalise, president of SIA.

Laptop prices have fallen by close to 14 percent compared to last year, and the memory used in these devices has increased 25 percent, SIA said. That has contributed to growing chip sales, which totaled $19.1 billion in August. The August sales growth points to a slow recovery from the steepest downturn the industry has faced in close to a decade, the research firm said. Consumer electronics - including mobile phones and laptops - accounted for the bulk of semiconductor sales.

Semiconductor sales grew across all major regions, with the Americas recording 5.4 percent growth and sales in Asia-Pacific growing by 5.3 percent. However, chip sales fell by 16 percent compared to August 2008, SIA said. Semiconductors like processors, and storage devices like flash memory, go into devices such as PCs and cell phones. Semiconductor sales also rose in July compared to June, iSuppli said on Friday. Sales to the consumer electronics sector increased by about 28 percent in the third quarter compared to the second quarter, iSuppli said in a statement. "This reveals a pattern of solid sequential growth for the first two months of the third quarter," iSuppli said. ISuppli predicted a third-quarter revenue decline of 16 percent on a year-over-year basis. Companies like Advanced Micro Devices and Dell have also said PC shipments could recover as users upgrade PCs with Microsoft's upcoming Windows 7 OS, which is due this month.

ISuppli earlier this week projected that semiconductor revenue could grow by as much as 13.8 percent in 2010. Intel CEO Paul Otellini last week said that the company's chip shipments were stabilizing as PC unit shipments showed signs of recovery. IDC in July reported stronger-than-expected PC shipments for the second quarter of this year, boosted by a strong interest in laptops and lower prices. PC shipments fell by 3.1 percent, less than IDC's original projection of a 6.3 percent decline for the second quarter. However, IDC warned that laptop shipments were affected by weak enterprise spending, which may pick up next year as companies look to upgrade to newer hardware and software. Semiconductors also include chips that go into cars.

Chip sales to the automotive sector grew by 30.2 percent, boosted by efforts like the "Cash for Clunkers" program in which discounts were offered by the U.S. government for trading in old for new vehicles, SIA said.

VMware ties disaster recovery to vSphere, lifting obstacle to adoption

VMware's Site Recovery Manager is now supporting vSphere, eliminating one of the obstacles preventing customers from upgrading to the latest version of VMware's virtualization platform. vSphere 4, the successor to ESX Server 3.5, was unveiled in April but until now did not work with Site Recovery Manager, VMware's software for recovering virtual machines in case of disaster. Because Site Recovery Manager did not immediately support vSphere, numerous customers have delayed upgrades from 3.5, acknowledges Jon Bock, product marketing manager for VMware's server business unit. VMware on Monday released SRM version 4, with support for vSphere and other upgrades including a "many-to-one failover [that] protects multiple production sites with automated failover into a single, shared recovery site."

Now that SRM supports vSphere, adoption should accelerate, he said. The months-long delay is similar to delays often seen between the release of a new operating system and its add-on products, he said. "vSphere was a significant change that we had to update the add-on products for. In a perfect world, we'd love to have all the new releases of products released on the same day as the platform," Bock said. "A customer who has important production applications on ESX 3.5 is probably not going to upgrade to vSphere 4 the day after it's released," he added. vSphere is still not supported by VMware View, the vendor's desktop virtualization software; View will be compatible with vSphere in its next release, expected in 2010, according to a VMware spokeswoman. Lifecycle Manager just gained compatibility with vSphere in a new release a few weeks ago.

In addition to support for vSphere, Site Recovery Manager now supports NFS storage, along with Fibre Channel and iSCSI, which were already supported. "We have a lot of interest in NFS from customers looking at using that in important applications," Bock said. SRM works by integrating tightly with storage array-based replication; VMware provides an integration module to partners, and most of the major storage companies have made their products compatible with Site Recovery Manager. Shared recovery sites, the other new feature, could be useful for companies with multiple branch offices, Bock said. The new version of SRM is available now and costs $1,750 per processor.

Overall, the new release is "focused on expanding the use cases for Site Recovery Manager," he said. SRM was first released in June 2008 and has been purchased by more than 2,000 customers, Bock said. That's still a small portion of VMware's 150,000 customers overall. Virtualization offers inherent advantages when it comes to disaster recovery, since it eliminates the need to recover the actual physical server an application was running on, Bock noted. Some customers have been using SRM not for disaster recovery but to move applications from one site to another when they are switching data centers, he said. Still, disaster recovery is the main purpose for the software.

SRM support for vSphere was a highly anticipated feature, says ITIC analyst Laura DiDio. "Disaster recovery and backup are in every customer's top five checklist of things you must have," she says. Follow Jon Brodkin on Twitter

Study shows open-source code quality improving

The overall number of defects in open-source projects is dropping, a new study by vendor Coverity has found. Coverity, maker of tools for analyzing programming code, received a contract in 2006 from the U.S. Department of Homeland Security to help boost the quality of open-source software, which is increasingly being used by government agencies. The vendor has set up a Web site through which open-source projects and developers can submit code to be analyzed, and it assigns projects to a series of "rungs" depending on how many defects they resolve. "Defect density" has dropped 16 percent during the past three years among the projects scanned through the site, and some 11,200 defects have been eliminated, according to Coverity's latest report.

Four projects have been granted top-level "Rung 3" status after resolving defects discovered during Rungs 1 and 2, Coverity said. They are Samba, tor, OpenPAM and Ruby. The Scan site has so far analyzed more than 60 million unique lines of code from 280 projects, according to Coverity, and developers from more than 180 of those projects are actively working with the scans. Coverity's scanning service employs static analysis, which checks code for security or performance problems without having to run the application itself.

This is preferable because "testing every path in a complex program as it runs requires constructing a large number of special test cases or structuring the code in special ways," Coverity said. "Static analysis [tools] won't tell you that your business process is working correctly ... but they will tell you that the code itself is technically solid, and follows the kind of programming best practices you'd expect to see from code that has gone through a proper code review," said Forrester Research analyst Jeffrey Hammond via e-mail. The tools tend to be most helpful for finding "structural 'anti-patterns' in code, poor programming practices that can result in performance and security issues like memory leaks and buffer overflows as well as more exotic conditions like errors due to parallel execution of code in a multicore CPU environment," he added.
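To make the idea concrete, here is a minimal sketch of the kind of check a static-analysis tool performs. It is not Coverity's engine, and the rule it applies is only an illustrative assumption: it parses Python source into an abstract syntax tree and flags open() calls that are not managed by a with block, a simple resource-leak anti-pattern, without ever executing the code being inspected.

import ast
import sys

class ResourceLeakChecker(ast.NodeVisitor):
    """Flag open() calls that are not wrapped in a 'with' statement."""

    def __init__(self):
        self.managed_lines = set()  # lines where open() is safely managed
        self.findings = []

    def visit_With(self, node):
        # open() used as a context manager is considered safe.
        for item in node.items:
            call = item.context_expr
            if isinstance(call, ast.Call) and getattr(call.func, "id", None) == "open":
                self.managed_lines.add(call.lineno)
        self.generic_visit(node)

    def visit_Call(self, node):
        # Any other open() call is reported as a potential resource leak.
        if getattr(node.func, "id", None) == "open" and node.lineno not in self.managed_lines:
            self.findings.append(node.lineno)
        self.generic_visit(node)

if __name__ == "__main__":
    filename = sys.argv[1]
    with open(filename) as handle:
        tree = ast.parse(handle.read(), filename=filename)
    checker = ResourceLeakChecker()
    checker.visit(tree)
    for lineno in checker.findings:
        print(f"{filename}:{lineno}: open() outside a 'with' block may leak a file handle")

Commercial tools such as Coverity's go far deeper, tracking values across function and file boundaries, but the principle is the same: reason about the source without running it.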

Adobe to buy Omniture for US$1.8 billion

Adobe has agreed to buy Web analytics company Omniture for US$1.8 billion in cash, the companies said Tuesday. The $21.50-per-share price that San Jose, California-based Adobe is paying represents a 45 percent premium over Omniture's average closing price for the last 30 trading days, Adobe said. On a conference call, executives wouldn't say whether there was a bidding war with other companies to buy Omniture.

Adobe, known for multimedia design, Web-development and document-creation software such as Flash, Dreamweaver and Acrobat, said the purchase will help the company add Web analytics and optimization capabilities directly to those products. For designers, developers and online marketers using its tools, the new capability will help them streamline how they create and deliver relevant content and applications, Adobe said. Advertisers, advertising agencies, publishers and online retailers can improve the experience of their end users and get more out of their digital media through the new analytical capability, the company said. The ability to measure which kinds of media, Web applications or Web pages are popular with users is becoming essential as more and more business is done on the Web, particularly in online advertising, said Forrester senior analyst John Lovett. He said a recent Forrester study found that 73 percent of companies doing business on the Web had some kind of analytics technology in place. "It's a ubiquitous technology that is in high demand at companies that are placing any parts of their business online," he said. On a conference call Tuesday, Adobe CEO and President Shantanu Narayen said the idea for a merger grew out of conversations with Omniture's CEO, Josh James, and with customers who wanted more out of the digital media they were creating using Adobe's products.

For example, Narayen said people were using Flash to create online advertisements but wanted a way to better understand click-through rates so they could see which ads were working. Adobe, too, found it wanted more information about the ads and digital media it was putting up on its own site. Customers thought there might be a way for Adobe to build that capability into its products, and "a number actually wanted us to integrate with solutions like Omniture," he said. Omniture had been an Adobe partner for some time, and in conversations with James, Narayen said, the two realized their companies had "the same vision" for how digital media and rich Internet applications could include Web analytics and optimization technology. The deal creates a "big opportunity" to allow content creators to potentially measure the impact of everything they do, Lovett added.

Forrester's Lovett said the deal will put Adobe a step ahead of other companies creating tools for developing digital content. "The combination of these two technologies makes sense - it's the creative meeting the measurement side of things," he said. Following the close of the deal, Omniture will become a new business unit within Adobe, and James will join Adobe as senior vice president in charge of that unit, reporting to Narayen. The companies expect the deal to close in the fourth quarter of Adobe's fiscal year, which ends Nov. 27.

Why Microsoft kept Exchange 2007 SP2 off latest Windows Server

Microsoft, reacting to a slew of questions from end-users, says timing issues and technical considerations kept it from supporting Exchange 2007 SP2 on the new Windows Server 2008 R2. Users have been peppering Microsoft with questions over the past few months, and the vendor chose Monday to explain its decision as it prepares to put the final touches on Exchange 2010, which aligns with other new infrastructure, namely Windows Server 2008 R2, Windows 7 and Office 2010. On the Exchange team blog, Nino Bilic, a member of the Exchange product quality team at Microsoft, wrote that there were two primary technical considerations for not supporting the messaging server on the new server OS.

"Two primary technical points drove our decision to not support Windows Server 2008 R2," Bilic wrote. "First, Windows Server 2008 R2, while an incremental OS upgrade, creates significant testing requirements for Exchange 2007. Because the Exchange 2007 SP2 engineering preceded the Windows Server 2008 R2 RTM, Exchange 2007 SP2 would have had to be delayed significantly to align testing schedules." Bilic said the second point involves not supporting the upgrade of a server OS underneath an existing Exchange server. "The primary need is to support Windows Server 2008 R2 domain controllers in an existing Exchange 2007 deployment, which we have done." Exchange 2007 SP2 can work against those domain controllers because no part of the Exchange infrastructure is running on the domain controller.

What users are missing is the ability to run any Exchange 2007 SP2 components, including the administrative tools, on Windows Server 2008 R2. Bilic said the level of testing that would have been required to ensure even a "minimum level" of compatibility would have been significant and still would have denied users many of the features of the new server OS. In addition, he said, the work likely would have altered the delivery schedule for Exchange 2010. That, Bilic said, drove Microsoft to conclude that the best decision was to release Exchange 2010 as close as possible to Windows Server 2008 R2, which is now available. Exchange 2010 is expected to ship in November. "We felt that thoroughly validating the combination of Exchange 2010 on Windows Server 2008 R2 allowed us to focus on delivering great solutions which would be fully tested and would support the features of Windows Server 2008 R2," Bilic wrote. "This is a hard trade-off to make, but we believe it is the right one and a good balance between serving existing customers and driving innovation."

The new Exchange 2010 server includes a number of new features, including high availability and cross-domain integration using techniques such as pairing the server with Windows Server 2008 clustering technology and directory federation features. The server also includes new archiving features. "We recognize that there are some downstream impacts to this decision related to administration-only installs," Bilic wrote. "The technical problem for us is that an administration install of Exchange is almost identical to a full Exchange server installation." An administration install is one in which only the administrative interface, used to manage server properties and other features, is loaded on the server OS.
Lee Dumas, the director of architecture at managed Exchange service provider Azaleos and a former Microsoft employee on the Exchange team, noted that Exchange 2007 SP2 contains the schema updates that are part of Exchange 2010. "So deploying SP2 prepares you for Exchange 2010. The earlier they can release SP2 the more customers will be prepared for 2010 so that might have had something to do with this as well," he said.

Dumas noted that releasing planned schema updates with a previous version of Exchange is something new for Microsoft. Follow John on Twitter.

Group seeks answers from DHS on delay of privacy report

A privacy rights group is pressing the U.S. Department of Homeland Security to disclose when it plans to release its annual privacy report to Congress. The Electronic Privacy Information Center (EPIC) on Tuesday sent a certified letter to Mary Ellen Callahan, DHS's chief privacy officer, noting that the department's last privacy report was released more than a year ago, in July 2008. "As it has been over a year since the publication of the last report, we would like to know when the current report, concerning the activities of your office, will be made available to the public," the letter states. The letter also noted that Callahan is obligated by law to prepare an annual report to Congress detailing activities at the agency that have an impact on privacy. The report also needs to detail complaints of privacy violations, implementation of the Privacy Act of 1974, and internal privacy controls within the DHS, the letter states.

Lillie Coney, EPIC's associate director, said the privacy report was "significantly tardy enough" to merit sending the letter to DHS. "We'd like to know what the agency has been doing regarding privacy," Coney said. EPIC needs to be sure that the DHS' privacy officer is sufficiently focused on her obligation to release the report in a timely fashion, Coney said. A copy of the letter was also sent to the chairman and the ranking member of the U.S. House Committee on Homeland Security. The DHS could not be immediately reached for comment. The annual report, which has been issued since 2003, chronicles the privacy issues that the DHS is focused on and shows whether it is fulfilling its constitutional obligations for privacy and civil liberties, Coney said. "It gives us an idea of the way the DHS has been prioritizing privacy issues and what resources it has made available" to address the issues, she said. This is not the first time EPIC has pressed DHS to release its reports in a timely fashion.

The group sent a similar letter to the DHS last year after the report's release was delayed. As one of the largest federal agencies, the DHS is involved in several projects that privacy groups such as EPIC keep a close eye on. Examples include Einstein 2.0, a network monitoring technology that improves the ability of federal agencies to detect and respond to threats, and the Real ID identity credentialing initiative. The DHS's terror watch list program, its numerous data mining projects, the secure flight initiative, the proposed use of body imaging technologies and its searches of electronic devices at U.S. borders are also closely followed by privacy groups.

iStockphoto guarantees its collection

Starting today, iStockphoto, the micropayment royalty-free image, video, and audio provider, will legally guarantee its entire collection against copyright, moral right, trademark, intellectual property, and rights-of-privacy disputes for up to $10,000. The new iStock Legal Guarantee, delivered at no cost to customers, covers the company's entire collection of more than 5 million files. Additional coverage, an Extended Legal Guarantee totaling $250,000, is available for the purchase of 100 iStock credits. Although common for traditional stock houses, such legal guarantees have not been standard in microstock because of the low prices; recently, however, Vivozoom, another microstock company, took a similar step to guarantee its collection. "Our first line of defense has always been-and continues to be-our rigorous inspection process," said Kelly Thompson, chief operating officer of iStockphoto. "The Legal Guarantee is simply an added layer of protection for our customers, many of whom are using microstock more than ever before." iStock says that files purchased and used in accordance with its license will not breach any trademark, copyright, or other intellectual property rights or rights of privacy.

And if a customer does face a claim, iStock will cover the customer's legal costs and direct damages up to a combined total of $10,000. iStock customers can increase their coverage for legal fees and direct damages up to a combined total of $250,000 by purchasing the Extended Legal Guarantee with 100 iStock credits (which cost between $95 and $138). iStock expects that this program will be popular with a very small percentage of sophisticated media buyers with very specific needs, and it considers the guarantee a value-added service to customers rather than a major source of revenue.

iTunes gains Automatically Add to iTunes feature

One of the most often requested features for iTunes has been the ability to designate a folder for it to watch, automatically adding any items you drop in that folder to its library. In iTunes 9, Apple has quietly added this feature, although I wouldn't blame you for not having noticed its existence. In typical Apple fashion, it's not exactly what people were asking for, but rather Apple's interpretation of what they want.

When you install iTunes 9, it automatically creates an Automatically Add to iTunes folder in your ~/Music/iTunes/iTunes Music folder (or under ~/Music/iTunes/iTunes Media if you created a new library after installing iTunes 9). When you put an iTunes-compatible media file in this folder, it will, as the name suggests, be added to iTunes automatically. If the application is running, any file you drop into the folder is added instantly; if not, it gets added the next time iTunes is launched. And if you ever delete or rename the Automatically Add to iTunes folder, iTunes simply creates a new one for you the next time it is launched. In my limited testing, I've found that it pretty much works as advertised.
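As a rough illustration of how the folder can be scripted against, the short Python sketch below sweeps a downloads folder and moves anything with a common iTunes-friendly extension into the watched folder. The paths mirror the defaults described above, and the extension list is my own assumption, so adjust both for your setup.

import shutil
from pathlib import Path

# Default location created by iTunes 9 (use "iTunes Media" for new libraries).
WATCH_FOLDER = Path.home() / "Music/iTunes/iTunes Music/Automatically Add to iTunes"
DOWNLOADS = Path.home() / "Downloads"

# Extensions iTunes is likely to accept; this list is an assumption, not Apple's.
ITUNES_FRIENDLY = {".mp3", ".m4a", ".aac", ".aiff", ".wav", ".m4v", ".mp4", ".mov"}

def sweep(source: Path = DOWNLOADS, target: Path = WATCH_FOLDER) -> None:
    """Move recognizable media files from source into the watched folder."""
    for item in source.iterdir():
        if item.is_file() and item.suffix.lower() in ITUNES_FRIENDLY:
            print(f"Moving {item.name} -> {target}")
            shutil.move(str(item), str(target / item.name))

if __name__ == "__main__":
    sweep()

Run periodically (via cron or launchd, say), a script like this approximates the watch-any-folder behavior people were asking for, although iTunes will still relocate the files once it imports them.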

It even looks for files in subfolders you create and adds them to the library as well. However, the feature does have a lot of caveats. For one thing, iTunes's list of supported formats, especially in the video department, is comically short; you can be pretty well assured that if a video was downloaded from the Internet, it will not be supported by iTunes. In such a case, iTunes moves the file to a Not Added subfolder within the Automatically Add to iTunes folder.

But that's to be expected, because iTunes has never exactly supported a host of media formats. Still, there are other problems. When users asked for an option to direct iTunes to a folder, they really wanted an option to direct iTunes to any folder. What Apple has done instead is create a pre-designated folder for the task, with no option to change it to any other location. So if you have a huge collection of media in your Movies folder, or on an external hard drive, that you'd like to automatically add to iTunes, you'll still have to move those files to that particular folder. And if you're going to move files by hand anyway, you can just drag and drop them onto the iTunes icon in the Dock and be done with it. What's the point, then?

Well, you say, we can just use the Automatically Add to iTunes folder as our primary movies folder, or even move it to a location of our choosing and leave behind an alias to take its place. Wouldn't that work? Not so much. Not only does iTunes refuse to accept anything added to the folder once it has been moved, but the presence of the alias also prevents iTunes from creating a new version of the folder.

And when iTunes does add media files from the Automatically Add to iTunes folder, it moves them into its own media folder and organizes them as it normally would, even if you have that option disabled under iTunes's advanced preferences. It also deletes any subfolders you create within the folder (a logical conclusion, given that they're useless if the media files you put in them never stay there). The only possible use I can see is to set it as the default download location for media files you purchase or download off the Internet, so that they are added to iTunes without your having to do anything (and even there, Apple has recommended you don't use it for incomplete files). I hope Apple rethinks this, gives users the freedom to use any folder they want, and makes iTunes stop moving media files around if the user doesn't want it to. In short, I don't think the feature is very useful in the form Apple chose to implement it. It's still a (very small) step in the right direction, though.

Details on presidential motorcades, safe house for First Family, leak via P2P

Details about a U.S. Secret Service safe house for the First Family - to be used in a national emergency - were found to have leaked on a LimeWire file-sharing network recently, members of the House Oversight and Government Reform Committee were told this morning.

Also unearthed on LimeWire networks in recent days were presidential motorcade routes and a sensitive but unclassified document listing details on every nuclear facility in the country, Robert Boback, CEO of Tiversa Inc., told committee members.

The disclosures prompted the committee's chairman, Rep. Edolphus Towns (D-N.Y.), to call for a ban on the use of peer-to-peer (P2P) software on all government and contractor computers and networks. "For our sensitive government information, the risk is simply too great to ignore," said Towns, who plans to introduce a bill to enforce just such a P2P ban.

Tiversa is a Cranberry Township, Pa.-based P2P monitoring services provider that in the past has served up dramatic examples of highly-sensitive information found on file-sharing networks. In January for instance, the company disclosed how it had discovered sensitive details about the President's helicopter, Marine One, on an Iranian computer after the document leaked over a P2P network.

Today's hearing continued in that vein, with Tiversa providing new sensational examples of leaked information. Boback showed off a document, apparently from a senior executive of a Fortune 500 company, listing every acquisition the company planned to make - along with how much it was willing to pay. Also included in the document were still-private details about the company's financial performance. Boback also showed numerous documents listing Social Security numbers and other personal details on 24,000 patients at a health care system, as well as FBI files, including surveillance photos of an alleged Mafia hit man, leaked while he was on trial. He demonstrated to members of the committee how child predators troll file-sharing networks looking for images and data they can use.

Speaking with Computerworld before the hearing, Boback said that all of the information was readily available on LimeWire's file-sharing network after apparently being leaked. The data on the nuclear sites was found on computers associated with four IP addresses in France, though it is not immediately clear where the data came from. The files containing information about the president had Obama's seal on them and a July date.

Though the information was not classified, it was sensitive enough that under normal circumstances it would not have been available even with a Freedom of Information Act request, he said.

This is the third time that the House Oversight committee has held a hearing on the topic of data leaks on P2P networks. The last hearing was two years ago and featured similar revelations from Tiversa and others.

The problem is well understood but remains difficult to stop. The leaks typically occur when a user installs a P2P client such as Kazaa, LimeWire, BearShare, Morpheus or FastTrack on their computer for the purpose of sharing music and other files with others on the network. In many cases, users inadvertently expose not just the files they want to share, but also every other file on their computers.
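One way to blunt that exposure is simply to audit a folder before a P2P client is pointed at it. The Python sketch below is purely illustrative and is not tied to any particular client; the default folder path and the list of "sensitive" extensions are assumptions you would tailor to your own environment.

import sys
from pathlib import Path

# Document types that commonly hold sensitive material; an illustrative list only.
SENSITIVE_EXTENSIONS = {".doc", ".docx", ".xls", ".xlsx", ".pdf", ".pst", ".csv"}

def audit_shared_folder(shared: Path):
    """Return files under 'shared' whose type suggests they should not be shared."""
    return [path for path in shared.rglob("*")
            if path.is_file() and path.suffix.lower() in SENSITIVE_EXTENSIONS]

if __name__ == "__main__":
    folder = Path(sys.argv[1]) if len(sys.argv) > 1 else Path.home() / "Shared"
    flagged = audit_shared_folder(folder)
    for path in flagged:
        print(f"WARNING: {path} looks like a document, not media -- review before sharing")
    print(f"{len(flagged)} potentially sensitive files found under {folder}")

A check like this would not stop a misconfigured client, but it makes the scope of what is about to be exposed visible before the software goes online.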

Boback and others have warned that leaks have resulted in file-sharing networks becoming a vast treasure trove of information for identity thieves, corporate spies and even foreign intelligence agencies. That's prompted calls for lawmakers to force software vendors to implement stricter security controls in their apps.

The only vendor at today's hearing was Mark Gorton, chairman of the Lime Group, the maker of LimeWire, which is the most-used P2P client available. Gorton testified two years ago and promised at the time to implement changes in the company's products to make it harder for users to inadvertently share files.

Today he insisted that the company had implemented many of those changes and that the latest version of LimeWire makes it much harder for data to be inadvertently leaked. Those claims were largely rejected by members of the committee, who blasted Gorton for failing to live up to his promises.

Pointing to the examples offered by Boback, Towns said that the file-sharing industry's promises to regulate itself had clearly failed. "Specific examples of recent LimeWire leaks range from appalling to shocking," Towns said. "As far as I am concerned, the days of self-regulation should be over for the file-sharing industry."

Other members want the issue investigated by the Federal Trade Commission, the Securities and Exchange Commission and law enforcement authorities. They said that the continued failure by companies such as LimeWire to take more proactive steps to stop inadvertent file-sharing is tantamount to enabling illegal activity resulting from the data leaks.

Towns plans to meet with the chairman of the FTC to determine whether the failure to stop inadvertent file-sharing constitutes an unfair trade practice by P2P companies.

IT pros continue to lose jobs

As the national unemployment rate continues to creep up, the number of jobs cut  in high-tech industries is also increasing across several IT segments tracked by Foote Partners.


The IT workforce analyst firm commented on recently released U.S. Department of Labor statistics that show the national unemployment rate is nearing 10% with 467,000 non-farm jobs lost in June. Among the jobs being cut are many of those in high-tech industries such as communications equipment, which lost 2,100 positions in June after shedding 600 in May. About 1,100 management/technical consulting positions were lost in June, after 700 were added in May and 1,600 in April.

Potential signs of a slowdown in job cuts could be found in the smaller totals of positions eliminated, Foote Partners suggests. The computer/peripheral equipment industry lost 2,300 jobs last month, down from the 3,200 eliminated in May, and 100 fewer positions were cut in the computer systems design/related services segment in June (2,700) than in May (2,800). The data processing/hosting/related services industry added 600 jobs in June, after losing 3,500 positions in May.

"The computer and peripheral equipment segment lost 900 fewer jobs this time, not exactly good news but certainly some encouragement," said David Foote, CEO of Foote Partners, in a statement. "It's important to note that fewer jobs were lost in the bellwether IT job segments, 1,800 fewer than in May to be exact. It supports continued evidence of counter trending in IT employment that we've seen every month since the Wall Street meltdown last October."

Do you Tweet? Follow Denise Dubie on Twitter here. https://twitter.com/DDubie

AMD Chips Used in Iranian HPC for Rocket Research

An Iranian research institute claims that it used AMD Opteron microprocessors to build a high-performance computing system, one more sign that the U.S. trade embargo on Iran isn't hindering that country's ability to import high-tech equipment.

The Aerospace Research Institute of Iran (ARI) posted a document on its Web site that describes a high-performance computer using dual-core chips from Advanced Micro Devices Inc.

The ARI, a government ministry, was founded in 1999 to conduct "aerospace science and technology" research, according to its Web site.

The site says that the SUSE Linux-based HPC system was launched with 32 cores and now runs 96 cores. Its performance was pegged at 192 GFLOPS.

It's unclear exactly when the Iranians started building the system.

"It is more than troubling that an Iranian aerospace entity, affiliated with the government and involved in sophisticated missile research and production, is using U.S. computer equipment for its development work," said Valerie Lincy, editor of Iran Watch, a Web site published by the Wisconsin Project on Nuclear Arms Control.

In a statement, AMD said that it can't speculate as to how the processors could have been shipped to Iran.

"AMD has never authorized any sales or shipments of AMD products to Iran or any other embargoed country, either directly or indirectly," the company said.

Mehdi Noorbaksh, an associate professor of international affairs at the Harrisburg University of Science and Technology in Pennsylvania, said that Iran buys its technology mostly on the black market.

"That market provides Iran with what the authorities need for these projects," Noorbaksh said.

High technology from U.S. companies appears to be widely available in Iran. Various Iranian firms advertise servers, networking products and components from a variety of U.S. vendors on their Web sites.

The ARI disclosure comes two years after the Iranian High Performance Computing Research Center said that it had assembled a Linux-based supercomputer using 216 Opteron processing cores.

This version of the story originally appeared in Computerworld's print edition.

HP revenue drops in tough climate

Computer industry bellwether Hewlett-Packard reported a 3 percent drop in revenue as its major lines of business continued to be hammered by the global recession.

The company also became the latest technology vendor to resort to layoffs in order to cut costs. Over the next 12 months, HP will lay off about 2 percent of its work force, or about 6,000 employees, HP Chief Financial Officer Cathie Lesjak said during a conference call with financial analysts Tuesday. HP employs 321,000 worldwide.

The company remained profitable, however, posting results that were in line with analyst expectations. HP recorded a profit of US$1.7 billion on sales of $27.4 billion. Earnings per share were $0.70 for its second fiscal quarter, ended April 30.

In a hopeful sign, the company reaffirmed its earlier guidance for fiscal 2009, saying it expected to earn between $3.76 and $3.88 per share for the year. That's better than analysts had been expecting. In a Thomson Financial survey of 26 financial analysts, the consensus estimate was $3.71 for the year. However, the company was pessimistic on revenue for the year, saying it would be down by 4 percent to 5 percent. Last quarter, HP had said it expected revenue to be down between 2 percent and 5 percent.

HP Chairman and CEO Mark Hurd said it was unlikely that corporate IT purchasing patterns would change in fiscal 2009. "We have customers that tell me, 'We're just delaying as long as we can until we have to buy,'" he said during a conference call with financial analysts Tuesday. "CIOs have been given marching orders that say, 'Take that infrastructure, keep the infrastructure running... be very particular about new projects you start, and if you can avoid starting that project, avoid starting it.'"

The quarter's revenue drop would have been much worse had HP not seen its services sales nearly double, year-over-year, thanks to the company's Aug. 26 acquisition of Electronic Data Systems (EDS). Services revenue was up 99 percent, totaling $8.5 billion for the quarter.

HP is in the process of cutting 24,600 EDS jobs as it absorbs the computer services giant. The company's EDS integration is ahead of schedule, Hurd said, with "roughly half" of those positions now eliminated.

Everywhere else, however, the financial numbers reflected the global slowdown: storage revenue was down 22 percent; midrange server revenue dropped 21 percent; and sales of the company's industry standard servers and business critical systems were both down 29 percent.

Sales of desktop PCs dropped 24 percent, notebooks were down 13 percent and revenue in the company's printer division was down 23 percent.

The company did see improvements in some areas. "We saw improvement in China, and it was material. We saw improvement in U.S. consumer that I wouldn't say was as material," Hurd said. "I just think we're going to need another quarter of data in order to make a meaningful statement about any upturn or anything like that."

HP posted disappointing earnings last quarter as well, as revenue dropped in all of its business units. Hurd responded by imposing wage cuts across the board at HP. He cut his own salary by 20 percent and those of HP's top executives by 15 percent. The company's remaining executives saw a 10 percent wage cut while all other salaries were slashed by 5 percent.

HP had been hoping that these wage cuts would help it avoid layoffs. In a Feb. 18 memo to employees, Hurd said, "I don't believe a major workforce reduction is the best thing for HP at this time."