The end of add-on storage

How to treat storage tech as a strategic asset, not an afterthought.

Through the years, storage has been cast as an extra -- something purchased as a server add-on that settles into the background.

The proliferation of data, however, has raised the profile of storage. The market research firm Meta Group Inc. reports that enterprise data is growing at a rate of up to 125 percent annually. As organizations contend with growth, storage consumes larger portions of their information technology budgets. Increasing pressure to protect more data for longer periods of time has also pushed storage into the spotlight.

In the federal sector, the focus on storage is perhaps as great as it has ever been. Agency officials are looking to consolidate and share storage resources to improve utilization. Many government entities are also seeking improved backup and disaster recovery plans.

But industry executives caution agencies against plunging headfirst into ambitious storage projects. Their recommendation: Conduct a storage assessment first. An assessment helps agency officials evaluate their storage capacity, and it describes storage goals. If properly executed, the assessment should provide a road map -- and a price tag -- for achieving those goals.

The necessity of a storage assessment, however, isn't embraced by all storage buyers. "A lot of customers jump to a solution and force-fit it into their environments," said Dan Smith, enterprise technology consultant for GTSI Corp.'s storage technology team.

But more federal agencies appear to be buying into the assessment concept.

"In the federal sector, there is a huge focus on this right now -- more so than even in the commercial sector," said Brendan Reilly, chief technology officer and vice president of consulting services at Sanz Inc., a storage systems integrator. The company has performed storage assessments for the Defense Logistics Agency, the Environmental Protection Agency and the Marine Corps, among other agencies.

"They are all searching for answers on how to do disaster recovery better and manage information better," Reilly said.

Basic questions

To uncover those answers, a storage assessment typically starts with a few basic questions: What goals does an organization have? What are the expectations for data growth? What is driving the demand for storage?

"The biggest thing is to understand the overall objectives of the customer," said Jon Wehse, business continuity practice manager for the federal division at storage vendor EMC Corp. During an assessment, the company works with senior executives to get the big-picture perspective, Wehse said. EMC officials ask customers about issues such as data accessibility, application availability and business continuity.

This questioning gives the company an idea of a customer's storage needs. At GTSI, for example, company officials interview those who work closely with customers' processes to flesh out high-level storage requirements, Smith said.

With this high-level view on hand, storage consultants zoom in on the specifics of storage: How much data does an organization possess? On what devices is it stored? How is the data managed and protected?

"In most IT infrastructures, the answer is, 'I don't know,'" Reilly said, adding that few organizations grasp how much data they have and whether it is properly protected. But to improve a storage environment, "you need to have those answers first," he said.

To that end, the storage assessment enters an information-collection phase. Consultants use manual methods and automated tools to determine how a customer stores and manages data. A storage resource management tool can find wasted storage space at customer sites, said Jim Geis, director of system solutions and services at the consulting firm Forsythe Technology Inc., which offers storage-assessment services.

During the course of an assessment, evaluators compile details on connectivity methods, backup technologies, management software and storage components, such as storage-area networks, network-attached storage and direct-attached storage. The resulting overview can reveal some surprises, executives said.
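As a rough illustration of the kind of report such an inventory produces, the sketch below computes per-volume utilization and flags wasted capacity. The volume names and figures are invented; the 35 percent threshold echoes the utilization level Reilly cites later in the article.

```python
# Hypothetical sketch of an SRM-style utilization report.
# Volume names and capacity figures are invented for illustration.

volumes = [
    {"name": "vol-sql-01", "capacity_gb": 500, "used_gb": 120},
    {"name": "vol-files",  "capacity_gb": 800, "used_gb": 610},
    {"name": "vol-mail",   "capacity_gb": 300, "used_gb": 75},
]

WASTE_THRESHOLD = 0.35  # flag volumes using less than 35% of capacity

for v in volumes:
    utilization = v["used_gb"] / v["capacity_gb"]
    flag = "UNDERUTILIZED" if utilization < WASTE_THRESHOLD else "ok"
    print(f'{v["name"]}: {utilization:.0%} used ({flag})')

# Fleet-wide view: total used versus total installed capacity.
total_cap = sum(v["capacity_gb"] for v in volumes)
total_used = sum(v["used_gb"] for v in volumes)
print(f"Fleet utilization: {total_used / total_cap:.0%}")
```

Real SRM tools gather these figures automatically across hundreds of servers; the arithmetic at the end is what reveals the gap between installed and used capacity.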

Wehse said large organizations may "end up with a large disparity between what they think they have and what they really do have." One check on server-attached storage found 750 servers when the customer expected to find only 350, he said.

This discovery process may require consultants to travel. Reilly said Sanz consultants visited 19 locations worldwide to inventory the Marine Corps' data holdings. But the effort helps customers learn how effectively they use storage. It can also uncover weaknesses in data backup approaches.

The data-collection stage details the state of an organization's storage architecture -- what a business process specialist would call the as-is environment. What follows is a description of the desired future state, or the to-be storage architecture.

Storage directions

An important step toward determining the future storage environment lies in defining what Smith calls operational requirements. "You go in and look at the detailed, granular requirements for individual applications and systems that depend on storage," he said.

Different applications and data call for different varieties of storage infrastructure. Reilly said that putting all of an organization's data in a highly replicated, high-availability environment would cost a considerable sum of money, which would largely be wasted. He said only about 15 percent of an organization's data requires instantaneous recovery.

A multitiered storage architecture, however, lets customers place their data on the most cost-effective platform. Mission-critical data, for example, could be housed on high-end, high-performance storage devices, while data of lesser importance could be stored on lower-performing and, therefore, less expensive storage devices.

"A lot of these technologies are expensive," Wehse said. The key is to apply an appropriate technology.

For that reason, an assessment may also help customers categorize their data. Data can be organized according to its value or mission importance and then assigned to the appropriate storage tier.

The to-be architecture emerges from that categorization process, in which performance and availability requirements are defined for each storage tier.
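The tier-assignment logic the executives describe can be sketched as picking the cheapest tier that still meets each data class's recovery requirement. The tier names, recovery times and datasets below are hypothetical, not drawn from any vendor's actual offerings.

```python
# Illustrative sketch of tiered data placement: assign each data
# class to the least expensive tier that still satisfies its
# recovery-time requirement. All names and figures are hypothetical.

# Tiers ordered cheapest-first; each lists the fastest recovery it supports.
TIERS = [
    ("tape archive",        72.0),  # hours to recover
    ("midrange NAS",         4.0),
    ("replicated high-end",  0.0),  # near-instantaneous recovery
]

def assign_tier(max_recovery_hours):
    """Return the cheapest tier whose recovery time is acceptable."""
    for name, recovery_hours in TIERS:
        if recovery_hours <= max_recovery_hours:
            return name
    return TIERS[-1][0]  # fall back to the most capable tier

datasets = {
    "order-processing db": 0.0,   # mission-critical: instant recovery
    "shared project files": 8.0,  # can tolerate a few hours offline
    "seven-year records":   96.0, # archival: days are acceptable
}

for name, requirement in datasets.items():
    print(f"{name} -> {assign_tier(requirement)}")
```

The point of the sketch is the cost logic: only the roughly 15 percent of data Reilly describes as needing instantaneous recovery lands on the expensive replicated tier, while everything else drops to cheaper platforms.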

"The to-be architecture consists of a list of all the capabilities that a system would need to support all those requirements," Smith said.

At this stage, the storage-assessment project focuses on analyzing the gap between a customer's as-is and to-be storage architectures. The consultant will typically make a series of storage recommendations, which provides a path to a storage solution.

"It's usually broken down into multiple projects, in a specific order, to get you to that end state," Geis said.

Reilly said his company offers low-cost, medium-cost and high-cost recommendations to clients. The low-cost recommendation emphasizes having the proper storage management policies and procedures in place.

"Most of the environments have very limited policies and procedures on how they manage data," Reilly said. So just implementing a storage management policy "can improve infrastructure fairly dramatically."

Higher-cost recommendations, meanwhile, point customers toward more ambitious storage projects, such as consolidation or new disaster recovery provisions. An agency may issue a request for proposals, based on the requirements uncovered during the assessment.

The bottom line: "Any assessment should be able to provide a road map," Reilly said.

Costs and benefits

To obtain the storage road map, agency officials must be prepared to invest time and money. Reilly said a storage assessment for a large corporate or government enterprise may take six to seven months to conduct and can cost $500,000 to $1 million. An assessment for a smaller organization may take eight to 10 weeks and cost $75,000 to $150,000.

Given that investment level, a formal assessment makes the most sense for organizations with more than 10 terabytes of data or organizations with storage in five or more locations, Reilly said.

The primary benefit of a storage assessment -- in addition to properly framing a storage project -- is savings.

"If you do a good job in an assessment, [organizations] can save between 20 [percent] and 35 percent of their storage-management costs," Reilly said.

Much of the savings stems from discovering, and then putting to use, underutilized storage. Reilly said some customers use less than 35 percent of their storage capacity. As a result of one assessment, a customer improved the use of existing assets and avoided purchasing additional disk storage for more than a year, he said.
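A back-of-the-envelope calculation shows why reclaiming that capacity defers purchases. The 35 percent utilization and 125 percent annual growth figures come from the article; the installed-capacity total is invented.

```python
# Hypothetical sketch: how long can reclaimed capacity absorb growth?
# Utilization (35%) and growth rate (125%/yr) are from the article;
# the 100 TB installed-capacity figure is invented for illustration.

import math

capacity_tb = 100.0    # installed capacity (hypothetical)
used_tb = 35.0         # 35% utilization found by the assessment
annual_growth = 1.25   # data grows 125% a year, i.e. 2.25x

# Solve used * (1 + growth)^t = capacity for t:
#   t = log(capacity / used) / log(1 + growth)
years = math.log(capacity_tb / used_tb) / math.log(1 + annual_growth)
print(f"Existing capacity absorbs growth for about {years:.1f} years")
```

Even at this aggressive growth rate, the existing capacity lasts roughly 1.3 years, consistent with the customer who deferred disk purchases for more than a year.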

Another area for potential savings lies in the introduction of new tools and procedures for automating or streamlining certain storage-management tasks. Defense Logistics Agency officials, for example, enlisted Sanz last year to help find ways to add new storage capacity while maintaining administrative staff levels.

Storage re-engineering can yield savings and improve efficiency, industry executives said. But to do that, customers have to gather intelligence to make the right decisions, according to Reilly. "And the only way to have that intelligence is to do a storage assessment," he said.

Moore is a freelance writer based in Syracuse, N.Y.