HP to focus on services with new print division

Hewlett-Packard on Monday formed a new print services division with a focus on managing print and imaging hardware and software in enterprises. The unit will also provide services and software that put scanned or printed documents in workflow systems to make document management easier. The division, called Managed Enterprise Solutions, aims to unify disparate hardware such as copiers, printers and scanners in order to cut hardware and printing costs, said Vyomesh Joshi, executive vice president of HP's imaging and printing group.

The company's attention has been geared toward hardware and supplies, but software and services surrounding printing and imaging are a growing opportunity, Joshi said. The company sees a US$121 billion annual opportunity in the printing market, of which $64 billion is for hardware and $57 billion for software and services. There is more to printing than just hitting the print button, said Roger Douglas, director of managed print services at HP. For example, software provided with the managed services could enable an invoice to be scanned and automatically put into a company's payroll system. The automation reduces the number of steps and the cost required to manage the document, Douglas said. Documents can also be secured through a service by establishing a status to ensure they aren't altered, Douglas said.

It also reduces the chance of error from manual transcription. For example, if a marketing logo is finalized on a particular document, its status can be appended to ensure no one changes it. This approach is particularly helpful when editing legal documents, he said. The company is also changing printer designs to build in more services-related functionality. For example, a touch screen on multifunction printers can be used to input or check the job status of scanned documents like patient records. "A lot of times customers have treated imaging and printing like an afterthought," Douglas said. The company has also expanded the availability of a program that guarantees savings for customers who sign up for its print services outside the U.S. Under the plan, HP assesses a company's imaging and printing environment and calculates the possible savings the company can realize using HP's managed services.

Managed print is all about stepping back and taking a more strategic and methodical look at how those documents are managed, he said. If customers haven't realized the savings in a year, HP will make up the difference with a credit that can be used for their next printing services contract. The company has already signed up 100 customers since it launched that program, Joshi said. The new unit will be part of the company's imaging and printing division, Joshi said. The company has pulled some personnel from the existing services division and has seen its services customer base expand since acquiring EDS. HP has a strong presence in the printer market, and the expansion of services could help the company capture a larger share in the printer space, said Edward Crowley, CEO of Photizo Group, who was at HP's press briefing Monday.

The increased level of focus on services could also benefit HP's enterprise customers, he said.

States scramble to track federal stimulus bucks

There's no such thing as a free lunch, especially for IT. Go to the state of Iowa's Web site, and you can see that of the $2.5 billion in federal economic stimulus money earmarked for the state under the American Recovery and Reinvestment Act of 2009 (ARRA), $553 million has already been spent on health, education, infrastructure and other programs designed to create jobs and jump-start the local economy. Drill a bit deeper into the data and you can pull up the exact amounts spent on weatherization training and technical programs, rental assistance programs and hundreds of other individual projects. Iowa CIO John Gillespie figures his IT organization has devoted about 800 man-hours so far to making that data available to the state's citizenry. "We actually had to build the application to give [different state agencies and programs] a way to submit data to us," he says.

But the far bigger challenge, Gillespie says, has been building the business rules and defining internal processes to comply with federal reporting requirements, which have changed or been updated several times since the stimulus package was first announced in February. Just as the $787 billion ARRA is unprecedented, so are the reporting demands it's making on state CIOs and IT organizations, which are scrambling to whip up new processes and tools to accurately track and account for their states' shares of the stimulus pie. States are required to file quarterly reports that fully account for every tax dollar spent. The process has been complicated by a variety of factors, including exceedingly tight deadlines and complex and changing federal reporting guidelines. Like so many of the energy and construction projects launched with stimulus dollars, tracking and reporting systems remain works in progress.

The best state Web sites for ARRA tracking
States with the best ARRA-tracking Web sites, as of July:
1. Maryland (see related story)
2. Colorado
3. Washington
4. West Virginia
5. New York
6. Pennsylvania
- Mitch Betts
Source: study by Good Jobs First (PDF), Washington, July 2009

Another big problem is the lack of a central accounting system in most states, which have had to first devise ways of extracting and aggregating data from multiple systems across hundreds of agencies before rolling it up to report it to the federal government. In Missouri, one of a handful of states to have a central accounting system used across all state agencies, funding and budgeting data is relatively easy to access. What remains difficult to grasp, however, is precisely what the federal government wants to know, says CIO Bill Bryan. Two data points the feds want to track are job creation and retention under the economic stimulus program. "But the definition and requirements for how to count jobs is quite a challenge to understand," according to Marilyn Gerard-Hartman, director of enterprise applications for Missouri. For example, if the state awards a highway infrastructure project to a contractor who in turn hires a subcontractor, who in turn hires other subcontractors, "how far down the chain is the state responsible for tracking? And do you only count it as a job created if the job wouldn't have existed without the ARRA funding?" Generally, "it hasn't been clear what the requirements are until fairly late in the game," adds Bryan. Meanwhile, fulfilling the requirements to the letter of the federal law is critical, Bryan notes. "If you don't comply, you could get thrown under the bus and not get any further funding."

"One of the biggest challenges is just the speed at which we had to get things done," says Iowa's Gillespie. "The rules for the most part didn't get finalized literally until weeks - not months - ago. Just keeping up has been the biggest challenge." Rather than licensing commercial stimulus-tracking tools, Gillespie's team internally developed a tracking and reporting system "using tools already familiar to financial folks who have all the data in Excel spreadsheets," he explains.

First things first
But before IT could build the tracking system, "we actually had to build a Web-based application to give people a way to submit data to us," Gillespie explains. The data is imported into a database, where it is aggregated, extracted and converted to an XML-formatted report and submitted online to the federal government.

Stimulus reporting: The basics
More than two-dozen federal agencies have been allocated a portion of the $787 billion in stimulus money.

Each federal agency develops specific plans for its share, then awards grants and contracts to state governments or, in some cases, directly to schools, hospitals, contractors or other organizations. The federal agencies are required to file weekly financial reports on how they're spending the money and their specific activities involving ARRA funds. Last month, states and other grant recipients for the first time filed the quarterly spending reports required under the law.

The executive branch has worked hard to ensure a smooth reporting operation, according to Rob Nabors, director of the federal Office of Management and Budget. As Nabors puts it: "Between OMB and the vice president's office and others in the White House and out in the agencies, we've done 169 different conference calls with recipients. There've been 170 events with state officials. We've had 37 different events with local government officials. There have been seven White House forums, and there've been 20 separate Recovery Act reporting training sessions. That by itself is an unprecedented effort by the federal government to make sure that we get it right, and this was something that started all the way back in February." - Julia King

All of this was done in a matter of weeks by a small team comprising a designer-architect, a programmer and a project manager who is the chief liaison between IT and the state's stimulus office. But given the fierce push to quickly distribute ARRA funds and get new projects up and running, traditional IT project management practices, such as having a comprehensive set of user requirements, have in some cases gone out the window. The team already had some experience from working on the state's recovery Web site, which Gillespie says has "been kind of an iterative process that has been going on since the first recovery money came out." He chalks up the speedy rate of progress on both projects to what he describes as healthy competition among states to have good reporting and great Web sites. "We wanted to be better than everybody else," he says.
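The pipeline Gillespie describes - agency submissions imported into a database, aggregated, and converted into an XML report - can be sketched in miniature. This is an illustrative sketch only: the field names (`program`, `amount`) and the report schema are hypothetical, not Iowa's actual federal reporting format.

```python
import csv
import io
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical agency submissions, as they might arrive from Excel exports.
SUBMISSIONS = """program,amount
Weatherization training,120000
Rental assistance,80000
Rental assistance,45000
"""

def build_report(csv_text):
    """Load rows into a database, aggregate by program, emit an XML report."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE spending (program TEXT, amount INTEGER)")
    rows = [(r["program"], int(r["amount"]))
            for r in csv.DictReader(io.StringIO(csv_text))]
    db.executemany("INSERT INTO spending VALUES (?, ?)", rows)

    root = ET.Element("report")
    for program, total in db.execute(
            "SELECT program, SUM(amount) FROM spending "
            "GROUP BY program ORDER BY program"):
        item = ET.SubElement(root, "project", name=program)
        item.text = str(total)
    return ET.tostring(root, encoding="unicode")

print(build_report(SUBMISSIONS))
```

The same shape - collect, aggregate, serialize, submit - underlies most of the state reporting systems described in this story; only the schemas differ.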

Tracking and reporting how many jobs are created with ARRA funds is a prime example. In Missouri, a team was in the midst of implementing Microsoft Corp.'s Stimulus360 software for tracking funding and projects when the feds issued a change in data models for reporting. "We had to move forward with plans and put things in place even though you knew [more] changes were coming," says Gerard-Hartman.

Like a start-up
At the newly formed Massachusetts Recovery and Reinvestment Office, Deputy Director Ramesh Advani likens the fast pace and deadline-driven atmosphere and culture to the environment of a start-up company. "When you're doing something for the first time, you deal with systems issues, people issues, deadline issues, and you come across something new every day," he says. Meeting the first federal reporting deadline of Oct. 10 required some on-the-fly tactics. "For the first round of reporting, what we have done is develop some manual templates, which we issued to state agencies. That data will then be gathered and uploaded into a recently launched central database, then uploaded through XML to the federal system," he explains.

Each agency must use the same template and also pass it on to subrecipients and vendors. But this is only for the first round of reporting, Advani emphasizes. For the long term, the state is developing an automated data-gathering and -analysis system that includes Oracle Corp. business intelligence tools, data marts and data warehousing. "We're trying not to make this a short-term solution, because ultimately we want to upgrade how we do grants management and our budgets across the board. We want to make sure we can use the same tools and reporting database beyond ARRA," Advani says.

ARRA timeline: 2009
* Feb. 17: The American Recovery and Reinvestment Act is signed into law. The federal government's Recovery.gov Web site goes live.
* Feb. 19: Federal agencies begin announcing block grant awards.
* May 17: Agency and program plans are posted on Recovery.gov.
* May - October: ARRA stimulus money is distributed to states and other recipients.
* May - August: Reporting requirements and updates are developed and distributed.
* Sept. 28: Recovery.gov is relaunched with geographic mapping.
* Oct. 10: The first quarterly deadline for recipient reporting is reached.
* Oct. 30: Recipient grant and loan data is posted.
- Julia King

ARRA's tight timeline and strict reporting deadlines have already driven some key process improvements in the state. For example, funding for ARRA transportation and highway projects had a 120-day "use it or lose it" deadline. But it typically took the state between 100 and 300 days to advertise projects and solicit bids from contractors. "We ended up drilling down into the system to get it down to a 40-day process, which is something we're proud about," Advani says. Another long-term improvement is the creation of the program office itself. Going forward, the office will oversee all activities involving grants reporting, monitoring and compliance. The goal is to increase overall information transparency "beyond what's expected for federal reporting," says Advani.

Before ARRA, for example, the state didn't provide electronic versions of contracts on its Web site - which is a requirement under ARRA. "Now, we'll take that technology and process new contracts through the same system so we can provide broader information on all contracts that have been awarded," he says. Maine CIO Dick Thompson says he's already been directed by a legislative committee to ensure that IT work done to meet ARRA reporting requirements is also used to increase information transparency statewide. The result of ARRA reporting, CIOs agree, is a lot like the road signs popping up that say: "Temporary Inconvenience, Permanent Improvement."

Network-based e-mail – Ready for prime time?

In the prior newsletter, we raised the question of whether the time is here - or past due - for moving e-mail from local PCs back to the network. This time we want to continue the discussion by looking at some of the key questions that need to be addressed.

Security: Of course, this is always the first question for any public e-mail services (such as Amazon). Is your "private" e-mail really private? In truth, our answer is "probably not." However, anything that has ever transited the Internet is likewise probably not truly private.

Data security: Yet another way of looking at "security." How difficult would it be for someone to hack into your public cloud-based e-mail? Given enough time and enough tries at a given account, the answer would be "not very." In reality, any illusion of true security is probably just that - an illusion. However, just to put this into perspective, what is the relative risk of someone hacking an online account vs. having a notebook computer (containing the same information) lost or stolen?

Private or public cloud: This is a tough one, and a lot depends on scale. For the SMB, the public services probably are quite appropriate. For larger shops, it's a more complex call. That said, we've seen numerous shops totally "outsourcing" e-mail to services like Google. This alleviates the necessity of having local servers, maintaining these servers, backing up on a regular schedule… This is basically the same as any other cloud application.

Storage availability: One of the major reasons years ago for moving to a PC-based service was that network storage was limited and expensive. Now, however, even the free version of Gmail offers individuals over 7GB of storage (with a constantly incrementing counter), and additional storage is available at an "almost free" price. That's no longer a stumbling block.

Compliance: A great question, and one that we'll be looking for your input on. Our initial take is that compliance with various regulations can be handled once by the cloud-based organization and then applied for multiple customers. More on this to come.

Personalization: Right. No problem. You don't want to have your corporate image as [fill-in-the-blank]@gmail.com or [fill-in-the-blank]@yahoo.com. If it's a private cloud, then you still have your own servers. And even if it's a public cloud, it's trivial to personalize with your own domain name.

Integrated interfaces and collaboration: Clearly, this is an area where we'll be seeing significant interest in the near future. And we're eagerly awaiting checking out "Google Wave" as a look at the next generation.

For now, the services are looking "good." And this is only a starting point for this issue. For a continuation, we invite you to join our discussion on this topic at TECHNOtorials.com.

Microsoft correctly predicts reliable exploits just 27% of the time

Microsoft's monthly predictions about whether hackers will create reliable exploit code for its bugs were right only about a quarter of the time in the first half of 2009, the company acknowledged Monday. "That's not as good as a coin toss," said Andrew Storms, director of security operations at nCircle Network Security. "So what's the point?" In October 2008, Microsoft added an "Exploitability Index" to the security bulletins it issues each month. The idea was to give customers more information to decide which vulnerabilities should be patched first. The index rates bugs on a scale from 1 to 3, with 1 indicating that consistently-successful exploit code was likely in the next 30 days, and 3 meaning that working exploit code was unlikely during that same period. Before the introduction of the index, Microsoft only offered impact ratings - "critical," "important," "moderate" and "low" - as an aid for users puzzled by which flaws should be fixed immediately and which could be set aside for the moment.

"Forty-one vulnerabilities were assigned an Exploitability Index rating of 1, meaning that they were considered the most likely to be exploited within 30 days of the associated security bulletin's release," Microsoft stated in its bi-annual security intelligence report, which it published Monday. "Of these, 11 were, in fact, exploited within 30 days." That means Microsoft got it right about 27% of the time.

Microsoft also tallied its predictions by security bulletins - in many cases a single bulletin included patches for multiple vulnerabilities - to come up with a better batting average. "Sixteen bulletins received a severity rating of Critical," it said in its report. "Of these, 11 were assigned an Exploitability Index rating of 1. Five of these 11 bulletins addressed vulnerabilities that were publicly exploited within 30 days, for an aggregate false positive rate of 55%."

The company defended its poor showing - even on a bulletin-by-bulletin level it accurately predicted exploitability only 45% of the time - by saying it was playing it safe. "The higher false positive rate for Critical security bulletins can be attributed to the conservative approach used during the assessment process to ensure the highest degree of customer protection for the most severe class of issues," said Microsoft.

"There's some validity to that," agreed Storms. "They're going to err on the side of caution, if only to prevent people saying 'I told you so' if an exploit appears later." John Pescatore, Gartner's primary security analyst, agreed, but added, "If they want to stick with the index, they need to adjust the criteria so fewer vulnerabilities get a '1.'"

With vulnerability-by-vulnerability predictions correct only a fourth of the time, Storms questioned the usefulness of the exploitability index. "What's the point of the index if they're always going to side on the more risky side, as opposed to what's most likely?" he asked. "In some ways, we're back to where we were before they introduced the exploitability index."

From Storms' point of view, the exploitability index was meant to provide more granular information to customers who wondered what should be patched first. Presumably, a vulnerability marked critical with an index rating of "1" would take precedence over a critical vulnerability tagged as "2" or "3" on the exploitability index. But in the first half of this year, Microsoft correctly predicted exploits just slightly more than one out of every four times. "With these numbers of false positives, we are in no better place than we were prior to the index, in respect to granularity," he said.

Pescatore also questioned the usefulness of the exploitability index. "I doubt anyone even looks at it," he said. Instead, Pescatore again argued, as he did last year when Microsoft debuted the index, that the company would better serve customers by abandoning its own severity and exploitability rankings and moving to the standard CVSS [Common Vulnerability Scoring System] ratings.

The CVSS system is used by, among other companies and organizations, Oracle, Cisco and US-CERT. "Because Microsoft does its own exploitability index, enterprises can't compare theirs with Adobe's or Oracle's. It's an apples and oranges thing then," said Pescatore. "It's not just Windows bugs that companies have to deal with anymore." He doubted Microsoft would take his advice. "They don't want to do that because then reporters and analysts can look and say, 'Microsoft has more higher-rated vulnerabilities than Oracle or Adobe,'" he said. "There's nothing in it for them to do that." Microsoft's security intelligence report, which covers the January-June 2009 period, was the first to spell out the accuracy of the exploitability index. Microsoft made the right call on all 46 vulnerabilities that were assigned an exploitability rating of "2" or "3," which indicate that an exploit would be unreliable or unlikely, respectively. "None were identified to have been publicly exploited within 30 days," Microsoft's report noted. If all its predictions in the first half of 2009 are considered, not just those marked as likely to be exploited, Microsoft got 57 out of a possible 87, or 66% of them, right. But Microsoft has touted its forecasting before. Microsoft's security intelligence report can be downloaded from its Web site in PDF or XPS document formats.
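The accuracy figures quoted from the report follow directly from its raw counts, as a quick back-of-the-envelope check shows:

```python
# Counts as quoted from Microsoft's security intelligence report.
rated_1 = 41          # vulnerabilities predicted likely to be exploited
rated_1_hits = 11     # of those, actually exploited within 30 days
rated_2_or_3 = 46     # predicted unreliable/unlikely; none were exploited

# Accuracy on the "likely to be exploited" predictions: 11 of 41, about 27%.
likely_accuracy = rated_1_hits / rated_1
assert round(likely_accuracy * 100) == 27

# Overall: every "2" or "3" rating was correct, so 11 + 46 = 57 of 87.
total = rated_1 + rated_2_or_3           # 87 predictions in all
correct = rated_1_hits + rated_2_or_3    # 57 correct calls
assert (total, correct) == (87, 57)
assert round(correct / total * 100) == 66

print(f"likely: {likely_accuracy:.0%}, overall: {correct / total:.0%}")
```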

A year ago, for example, Microsoft said in a postmortem of its first-ever index that although it had accurately predicted exploits less than half the time, it considered the tool a success. "I think we did really well," said Mike Reavey, group manager at the Microsoft Security Response Center (MSRC), at the time.
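Pescatore's preferred alternative, CVSS, boils each vulnerability down to a 0-10 base score computed from standard metrics, which is what makes cross-vendor comparison possible. As a rough illustration only (a sketch using the metric weights published in the CVSS version 2 specification, not anything drawn from Microsoft's report):

```python
# CVSS v2 metric weights, as published in the v2 specification.
ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
IMPACT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

def cvss2_base(av, ac, au, conf, integ, avail):
    """CVSS v2 base score: combine impact and exploitability subscores."""
    impact = 10.41 * (1 - (1 - IMPACT[conf])
                        * (1 - IMPACT[integ])
                        * (1 - IMPACT[avail]))
    exploitability = (20 * ACCESS_VECTOR[av]
                         * ACCESS_COMPLEXITY[ac]
                         * AUTHENTICATION[au])
    f = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

# A remotely exploitable, no-auth, total-compromise bug scores the maximum:
print(cvss2_base("network", "low", "none", "complete", "complete", "complete"))  # 10.0
# The common "remote, partial impact" profile scores 7.5:
print(cvss2_base("network", "low", "none", "partial", "partial", "partial"))     # 7.5
```

Because the weights and formula are fixed by the specification, an Oracle bug and a Windows bug with the same vector get the same score, which is exactly the apples-to-apples property Pescatore is asking for.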

Cloudera intros Hadoop management tools

Startup Cloudera is introducing a set of applications on Friday for working with Hadoop, the open-source framework for large-scale data processing and analysis. It allows an application workload to be spread over clusters of commodity hardware, and also includes a distributed file system. Cloudera, which provides Hadoop support to enterprises, developed the new browser-based application suite to simplify the process of using Hadoop, according to CEO Mike Olson. "It's an easy-to-use GUI suitable for people who don't have a lot of Hadoop expertise," Olson said. "The big Web properties with sophisticated and talented PhDs have been successful [with it], but ordinary IT shops ... have had a harder time." Hadoop is known for its behind-the-scenes role crunching oceans of information for Web operations like Facebook and Yahoo. But although the technology is "at its best" when data volumes get into multiple terabytes, Hadoop has relevance for a wide variety of companies, according to Olson. "It's increasingly easy to get your hands on that much data these days," especially from machine-generated information like Web logs, he said.
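Hadoop's programming model, MapReduce, splits a job into a map phase that emits key-value pairs and a reduce phase that folds together all values sharing a key; the framework handles spreading both phases across the cluster. Here is a toy single-process sketch of the model (an illustration of the concept only, not Cloudera's suite; real Hadoop jobs are typically written in Java or run as Hadoop Streaming scripts):

```python
from collections import defaultdict

def map_phase(record):
    """Mapper: emit (token, 1) for every token in an input line."""
    for token in record.lower().split():
        yield token, 1

def reduce_phase(key, values):
    """Reducer: fold all counts for one token into a total."""
    return key, sum(values)

def run_job(records):
    """Simulate the shuffle step that groups map output by key."""
    groups = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            groups[key].append(value)
    return dict(reduce_phase(k, v) for k, v in groups.items())

# Counting tokens in Web-server logs is the classic small-scale example
# of the machine-generated data Olson mentions.
log_lines = ["GET /index.html", "GET /about.html", "POST /index.html"]
print(run_job(log_lines))  # {'get': 2, '/index.html': 2, '/about.html': 1, 'post': 1}
```

On a real cluster the input would be terabytes of files in the distributed file system, with the shuffle and both phases distributed across many machines; the per-record logic stays this simple.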

Cloudera and its partners are fine-tuning the suite, which is now in beta, before issuing a general release. The browser-based application set is supported on Windows, Mac and Linux, and includes four modules: a file browser; a tool for creating, executing and archiving jobs; a tool for monitoring the status of jobs; and a "cluster health dashboard" for keeping tabs on a cluster's performance. Hadoop needs many more tools like it, according to analyst Curt Monash of Monash Research. "If Hadoop is to consistently handle workloads as diverse and demanding as those of [massively parallel processing] relational DBMSes, it needs a lot of tools and infrastructure," Monash said via e-mail. "The three leaders in developing those are Yahoo, Cloudera, and Facebook. There's a long way to go."

Nasty banking Trojan makes mules of victims

A sophisticated Trojan horse program designed to empty bank accounts has a new trick up its sleeve: It lies to investigators about where the money is going. First uncovered by Finjan Software last week, the URLzone Trojan was already known to be very advanced. It rewrites bank pages so that victims don't know their accounts have been emptied, and it has a sophisticated command-and-control interface that lets the bad guys pre-set what percentage of the account balance they want to clear out. But Finjan isn't the only company looking into URLzone.

RSA Security researchers say the software uses several techniques to spot machines that are run by investigators and law enforcement. When URLzone identifies one of these, it sends it bogus information, according to Aviv Raff, RSA's FraudAction research lab manager. Researchers typically create their own programs that are designed to mimic the behavior of real Trojans. When URLzone spots a researcher's program, instead of simply disconnecting from the researcher's computer, the server tells it to do a money transfer. But instead of transferring the money into one of the criminal's money mules - people who have been recruited to move cash overseas - it chooses an innocent victim. Security experts have long published research into the inner workings of malicious computer programs such as URLzone, Raff said. "Now the other side knows that they are being watched and they're acting," he said.

So far, more than 400 legitimate accounts have been used in this way, RSA said. Typically, these are people who have received legitimate money transfers from other hacked computers on the network, Raff said. The idea is to confuse researchers and to prevent the criminal's real money mules from being discovered. According to Finjan, URLzone infected about 6,400 computer users last month and was clearing about €12,000 (US$17,500) per day. Banking Trojans such as Zeus and Clampi have been emptying accounts for years now, but Finjan dubbed URLzone the first of a new, smarter generation of the crimeware.