New tech trends force government to rethink storage strategies

Regulatory compliance requirements, server virtualization, green IT and cloud computing are creating challenges and opportunities for holding the line on enterprise storage costs.

In Delaware, Freedom of Information Act and lawsuit discovery requests seemed to have one thing in common: They all beat a path to the state’s e-mail storage systems.

Because Delaware electronically archives e-mail messages, producing the sought-after records is a job that has been thrust upon Douglas Lilly’s team in the state’s information technology department. The problem is that their storage management software, like that used by most government entities, was designed to move big batches of files quickly from one storage device to another, not to make that kind of granular data retrieval fast or easy.

“The soft costs that were being forced on our team ... were killing us,” Lilly said, referring to the labor-intensive process of hunting down e-mail. “Our [storage] software is just designed for disaster recovery, not so much for compliance.”

The growing cost of meeting regulatory compliance requirements is only one of several forces driving government IT executives like those in Delaware to rethink their traditional storage practices and architectures. A wave of server virtualization deployments in the data center, demand for more energy-efficient IT, and emerging cloud-computing options also create challenges and opportunities for holding the line on enterprise storage costs.

Without action, new costs could add up fast. Agencies already spend between 5 percent and 15 percent of their annual IT budgets on storage-related capital and labor costs, industry analysts say.

The assignment for IT executives: seek cost-cutting strategies that address not only the technology but also the people and processes involved in the storage infrastructure.

Dealing with compliance

Delaware’s initial response to the record-request crunch was to press its existing data backup and disaster recovery software into service as an e-mail discovery solution, but that approach was inadequate, said Lilly, the lead telecom technologist for the state’s Department of Technology and Information.

He and his staff are the stewards of the Microsoft Exchange e-mail system used by more than 18,000 state employees. He pointed to the example of a request for a year’s worth of e-mail for 30 users — a fairly common query. If the users’ mail was scattered across multiple Exchange mail stores, a discovery job of that size could take as long as six to eight hours per user using the backup software as a retrieval tool. For 30 users, that works out to 180 to 240 hours of staff time.

To alleviate this burden, Lilly and his team evaluated e-mail archiving systems. After testing systems from all the major players in the market, they selected Mimosa’s NearPoint e-mail archiving solution and deployed it late last year.

Lilly said NearPoint, coupled with an Exchange server consolidation, will shrink e-mail search times from hours to about 15 minutes. Those dramatic results, however, won’t arrive until the archive has a year’s worth of employee e-mail stored.

In addition to speeding discovery requests, the e-mail archive provides more reliable retrieval. The backup and disaster recovery approach the staff previously used could miss e-mail messages deleted between backup sessions. Lilly also cited NearPoint’s ability to track message history: the archive follows e-mail as messages are forwarded from person to person.
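The speedup comes largely from how the data is organized: backup software must restore and scan whole mail stores, while an archive indexes messages by attributes such as mailbox and date as they are ingested, so a discovery request becomes a lookup rather than a restore. The following is a minimal sketch of that indexing idea in Python; the field names and in-memory structures are illustrative, not NearPoint’s actual design.

```python
# A minimal sketch of why an archive beats backup software for
# discovery: messages are indexed by mailbox and year at ingest, so a
# request for one user's year of e-mail is a dictionary lookup, not a
# restore-and-scan job. Illustrative only; not NearPoint's design.
from collections import defaultdict
from datetime import date

class MailArchive:
    def __init__(self):
        self._index = defaultdict(list)  # (mailbox, year) -> message IDs
        self._messages = {}

    def ingest(self, msg_id, mailbox, sent: date, body: str):
        self._messages[msg_id] = (mailbox, sent, body)
        self._index[(mailbox, sent.year)].append(msg_id)

    def discover(self, mailbox, year):
        """Return all archived messages for one mailbox in one year."""
        return [self._messages[m] for m in self._index[(mailbox, year)]]

archive = MailArchive()
archive.ingest("m1", "jdoe@state.de.us", date(2008, 3, 14), "Budget memo")
print(archive.discover("jdoe@state.de.us", 2008))
```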

But as agencies retool technology for compliance, they should also plan for the ongoing management of archiving systems. Jim Smid, data center practice manager at Apptis Technology Solutions, said most government bodies struggle with how to do e-mail archiving and who should be responsible for it. To wit: Will the job go to an Exchange administrator, storage administrator or compliance officer?

“E-mail archiving really requires somebody who is responsible for the care and feeding of those systems,” Smid said.

Virtual impact

Server virtualization has swept through government agencies looking to consolidate servers and trim expenses. This approach partitions a single physical server into multiple virtual machines that can independently run operating systems and applications.
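To illustrate the one-host, many-guests model, the short sketch below enumerates the virtual machines defined on a single physical server. It assumes the open-source libvirt Python bindings and a local QEMU/KVM hypervisor; the states quoted in this article use VMware, whose management APIs expose similar inventories.

```python
# A sketch of the partitioning model the paragraph above describes:
# one physical host running several independent guest machines.
# Assumes the libvirt Python bindings and a QEMU/KVM host; libvirt is
# a stand-in here, not the product any agency quoted actually uses.
import libvirt

conn = libvirt.open("qemu:///system")   # connect to the one physical host
for dom in conn.listAllDomains():       # enumerate its virtual machines
    state = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {state}")
conn.close()
```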

Virtualization on the server side has significant ramifications for storage. Jim Damoulakis, chief technology officer at storage consulting firm GlassHouse Technologies, said server virtualization challenges organizations to redefine the IT infrastructure, storage included.

“It definitely has a substantial impact on how storage is planned and designed and implemented,” he said.

For example, virtual server environments might decrease the need for locally attached storage because there are fewer physical servers to attach it to, but they might increase demand for shared storage-area network (SAN) capacity. That is the case in Pennsylvania, where the commonwealth is using server virtualization to dramatically reduce the population of physical servers.

Tony Encinias, Pennsylvania's chief technology officer, said the state has boosted its SAN storage capacity in light of virtualization. To keep costs in check, the state takes a tiered storage approach.

High-performance disk-based storage — the most expensive variety — is reserved for production environments requiring fast access to data. Less expensive storage devices are assigned to areas such as application testing that can make do with lower performance. Unisys is helping the state with its storage strategy.

“With the amount of footprint we have — and plan to have — in the virtual environment, we can’t afford to go across the board with high-performance storage,” Encinias said.
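The tiering decision Encinias describes can be expressed as a simple policy table that maps workload classes to storage tiers. Below is a minimal sketch; the tier names and per-gigabyte rates are hypothetical, since the article does not detail Pennsylvania’s actual classifications or pricing.

```python
# A minimal sketch of a tiered storage policy: production workloads
# land on high-performance (expensive) disk, while test and development
# make do with cheaper, slower tiers. Names and costs are hypothetical.
TIER_POLICY = {
    "production":  {"tier": "tier-1 (fast disk)",     "cost_per_gb": 5.00},
    "test":        {"tier": "tier-2 (capacity disk)", "cost_per_gb": 1.50},
    "development": {"tier": "tier-2 (capacity disk)", "cost_per_gb": 1.50},
    "archive":     {"tier": "tier-3 (nearline)",      "cost_per_gb": 0.50},
}

def place(workload: str, gigabytes: int):
    """Pick a tier for a workload and estimate its monthly cost."""
    policy = TIER_POLICY[workload]
    return policy["tier"], gigabytes * policy["cost_per_gb"]

tier, cost = place("test", 500)
print(f"500 GB of test data -> {tier}, about ${cost:,.2f}/month")
```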

Virtualization also affects storage management processes and staff responsibilities. Damoulakis said storage managers could carve out large quantities of storage capacity on a storage device and hand it to the server virtualization team to manage. The storage administrators, as a result, lose the visibility into resource utilization that they had when storage devices were connected to physical, stand-alone servers.

The upshot is that server and storage administrators must work closely together to make sure virtual servers have the storage resources they need.

“Those folks are going to have to really team up,” Encinias said. “They are going to have to work together to make sure we have an optimal configuration.”

Technologists also must work together to devise backup solutions. As virtual machines proliferate, so too does data redundancy. What’s more, virtual machines contend for a physical server’s shared resources — memory and network cards. A data-intensive process such as backup can overtax the physical server’s resources.

“The backup problem becomes more and more serious” as the number of virtual servers grows, Damoulakis said.

Industry consultants view the new crop of data deduplication technology as a potential solution. Deduplication products are designed to eliminate redundant data, which Damoulakis said eases the resource contention issue and reduces storage space and costs.
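Deduplication products generally work by fingerprinting chunks of data and storing each unique chunk only once, so repeated backups of near-identical virtual machine images consume little new space. Below is a minimal content-hash sketch in Python; real products add refinements such as variable-size chunking, compression and integrity verification.

```python
# A minimal sketch of data deduplication: split incoming data into
# fixed-size chunks, fingerprint each with SHA-256, and store a chunk
# only the first time its fingerprint appears.
import hashlib

CHUNK_SIZE = 4096
store = {}  # fingerprint -> chunk bytes

def write(data: bytes) -> list[str]:
    """Store data; return the list of chunk fingerprints (the 'recipe')."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)  # each unique chunk stored once
        recipe.append(fp)
    return recipe

def read(recipe: list[str]) -> bytes:
    """Reassemble the original data from its recipe."""
    return b"".join(store[fp] for fp in recipe)

backup1 = write(b"A" * 8192)  # two chunks stored
backup2 = write(b"A" * 8192)  # identical data: zero new chunks
print(len(store), "unique chunks held for two backups")
```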

Encinias said Pennsylvania is looking at deduplication as a potentially money-saving tool as its virtualized environment expands.

Green challenge

There was a time not long ago when green IT focused mainly on servers — the low-hanging fruit in the data center, Smid said.

Doug Chabot, vice president and senior solutions architect at QinetiQ North America Mission Solutions Group, said data centers now take a big-picture view of environmental impact, incorporating energy consumption into return-on-investment calculations.

“That is a big shift,” he said of data centers’ conservation moves. “They are looking at it across the board — not just servers but data storage, too.”

Virtualization often serves as the gateway to green computing. On the server side, the ability to run multiple virtual machines on a single server lets organizations reduce the population — and therefore the power consumption — of physical servers.

Similarly, virtualization paves the way for data consolidation, which is where green demands can have an impact on storage. Data once housed in stand-alone storage devices tethered to individual servers migrates to shared, networked storage. Technologies such as SANs let organizations eliminate isolated storage silos.

Officials in Scott County, Minn., discovered that their virtualization and data consolidation strategy cut back on power consumption. The fast-growing county, which lies about 25 miles southwest of Minneapolis, found that its data center’s power transformer was nearing capacity. The IT shop also contended with concerns over power supply and grid capacity during periods of high demand.

Scott County adopted VMware virtualization software to consolidate 45 servers to five, said Perry Mulcrone, the county's deputy CIO. Mulcrone’s staff migrated data from multiple direct-attached storage devices to two SANs from Compellent.

The power grid concerns eased as the number of devices drawing power decreased, he said. The county also avoided the expensive proposition of replacing the data center’s transformer.

Green IT harbors the potential for savings. But Steve Mackie, president and chief executive officer of data storage consultant Storage Strategies, cautioned buyers to be wary of vendors touting green capabilities in their marketing literature.

“IT managers should do their homework and quantify the value behind an investment in green architecture and validate any [vendor] claims,” he said.

Clouds coming

Cloud computing offers an option for organizations weary of purchasing and managing their own storage. In this architecture, a service provider hosts the storage capacity, which users access via the Internet. Vendors include Amazon.com.
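In practice, access to provider-hosted storage happens through a web API rather than a local disk. Below is a minimal sketch against Amazon’s S3 storage service using the boto3 library; the bucket and key names are hypothetical, and credentials are assumed to be configured separately.

```python
# A minimal sketch of the cloud storage model: the provider hosts the
# capacity, and the agency reads and writes over the Internet through
# a web API. Assumes Amazon S3 via the boto3 library; bucket and key
# names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Write a record to provider-hosted storage.
s3.put_object(Bucket="agency-records", Key="fy2009/report.pdf",
              Body=b"...document bytes...")

# Read it back later from anywhere with network access.
obj = s3.get_object(Bucket="agency-records", Key="fy2009/report.pdf")
print(len(obj["Body"].read()), "bytes retrieved")
```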

Chabot said the agencies he talks with want to outsource storage because of the potential savings involved.

“We see a strong interest in cloud computing ... but the trend is not yet there,” he said.

A key stumbling block is government’s worry about the safety of data transferred to an outside party. One issue is how to certify and accredit systems and infrastructure running in a cloud, according to government and industry executives. Federal agency systems are required to pass such a certification and accreditation (C&A) process.

Although security is an obstacle today, Art Fritzson, a vice president at Booz Allen Hamilton, said he believes it might eventually drive the government’s adoption. Agencies might find the expense of securing their own storage systems an argument for the cloud option if a service provider can do it more cost-effectively, he said.

In the meantime, some government IT departments have started using the storage-as-a-service model internally, rather than through an outside party, as a way to more accurately account for and charge for the storage and computing resources that different parts of their organizations use.

Such services might use internal networks instead of the public Internet but resemble the cloud approach. Some agencies plan to adopt the price lists and service-level agreements commercial cloud vendors provide their customers.
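A chargeback model of this kind reduces to metering each office’s usage and applying a published price list, just as a commercial cloud vendor would. Below is a minimal sketch; the server rate and bundled storage echo the RACE figures cited later in this article, while the per-gigabyte overage rate is hypothetical.

```python
# A minimal sketch of an internal storage-as-a-service chargeback:
# meter consumption, then bill against a published price list. The
# $500/server/month rate and 50 GB allotment mirror the RACE figures
# reported below; the overage rate is hypothetical.
PRICE_LIST = {
    "server_month": 500.00,    # per virtual server per month
    "extra_gb_month": 0.50,    # per gigabyte beyond the allotment
}
INCLUDED_GB = 50  # storage bundled with each server

def monthly_bill(servers: int, total_gb: int) -> float:
    """Compute one office's monthly charge from metered usage."""
    extra_gb = max(0, total_gb - servers * INCLUDED_GB)
    return (servers * PRICE_LIST["server_month"]
            + extra_gb * PRICE_LIST["extra_gb_month"])

# Example: 4 servers using 300 GB in total (100 GB over the allotment).
print(f"${monthly_bill(4, 300):,.2f} for the month")  # $2,050.00
```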

The Defense Information Systems Agency, for example, launched its first venture in internal cloud computing last fall and now eyes storage as an area for expansion.

Phase I of DISA’s Rapid Access Computing Environment (RACE), which became operational in October, lets agency customers purchase access to a full computing environment on a month-by-month basis at a base price of $500 per server.

Henry Sienkiewicz, technical program adviser at DISA's Computing Services Directorate, said user responses have been positive and, as expected, additional functions have been requested. One of the most requested functions, he noted, is additional storage capacity beyond the standard 50GB per server that RACE now provides. Another request is for a standardized data backup service.

The additional storage and backup capabilities are slated to arrive in April or May, Sienkiewicz said. DISA will use existing disk and tape resources to meet RACE’s storage requirements.

In another example, the Transportation Department’s Pipeline and Hazardous Materials Safety Administration (PHMSA) is creating a virtualized computing and storage environment that it will offer as a service to other DOT agencies.

Steve Grewal, information systems security officer at PHMSA, said the administration plans to focus initially on servers used for test and development when it launches the pilot program in the next 60 days. Other DOT agencies will be able to access PHMSA’s virtual machines and storage through the department's network backbone.

“The idea is to demonstrate that it is reliable and stable,” Grewal said.

As the program is introduced, PHMSA will develop a formal cost and support model that will establish a price list and a service-level agreement.