Metrics matter when setting consolidation goals

Agencies need to look before they leap into consolidation

Through data center virtualization and consolidation, agencies are supposed to achieve the efficiencies and cost savings that are the goals of the Federal Data Center Consolidation Initiative. But first, they must identify the best opportunities for consolidation.

To do that, they need to accurately measure the assets they have in their data centers and determine how the assets contribute to overall costs and performance.
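As a rough illustration of what that kind of measurement could look like, the sketch below rolls up a hypothetical per-server annual cost from power draw, maintenance and amortized acquisition cost. The field names, utility rate and figures are all assumptions for illustration, not a standard agency cost model.

```python
# Illustrative sketch of a per-server annual cost rollup.
# All rates and figures below are assumptions for illustration only.

POWER_COST_PER_KWH = 0.11   # assumed utility rate, $/kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(server):
    """Estimate one server's yearly cost from power, maintenance
    and straight-line amortization of its purchase price."""
    power = server["avg_kw"] * HOURS_PER_YEAR * POWER_COST_PER_KWH
    amortized = server["purchase_price"] / server["lifespan_years"]
    return power + server["annual_maintenance"] + amortized

# Hypothetical inventory records.
inventory = [
    {"name": "app-01", "avg_kw": 0.45, "annual_maintenance": 800,
     "purchase_price": 6000, "lifespan_years": 5},
    {"name": "db-01",  "avg_kw": 0.80, "annual_maintenance": 1500,
     "purchase_price": 12000, "lifespan_years": 5},
]

for s in inventory:
    print(f"{s['name']}: ${annual_cost(s):,.0f}/year")
```

Even a simple rollup like this makes it visible which assets drive cost, which is the prerequisite for deciding where consolidation pays off.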

“Ideally, agencies want the big picture in mind when they start a consolidation project,” said Jim Smid, data center practice director at Iron Bow Technologies. “They want to make sure they don’t waste money on the effort, and they need to make sure that consolidation will be part of their overall solution for IT two or three years down the road.”

Given the budget constraints agencies now face, they can't start investing in these kinds of projects without knowing the projects will pay off, so they must have that final destination in mind from the start, he said.

And that is a problem for many agencies. Smid said that when he works with government customers, pulling that big picture together is often a major stumbling block to consolidation projects. There's usually no single source with knowledge of the full range of data center metrics, such as power consumption, ongoing maintenance costs or the cost of bringing new servers into the mix.

Two “Measure to Manage” reports produced this year by MeriTalk pointed to various reasons for that situation. The first, published in April, showed that agencies across the government use at least three different definitions for data centers. Although just about all agencies have approved sets of criteria to identify consolidation opportunities, they disagree on the best metrics to use.

In the end, the MeriTalk report concluded, only two in five agencies have a clear picture of what their consolidation costs will be.

The second report in the series, published at the end of June, painted an even more disturbing picture. Not only did most agencies lack the metrics needed to track efficiencies and potential savings for things such as energy, storage use and capacity allocation, but fewer than half could say what incentives there were for data center consolidation, including the savings that would be realized in budgets outside IT.

There are areas of agency spending, such as energy and real estate, that the IT department does not control, said NetApp's Mark Weber, so it's understandable that IT managers pay little attention to them. But there are many other things they can control.

“The best thing agencies could do, frankly, is to leverage industry best practices for data center consolidation and provide their IT departments with incentives to do so,” Smid said. “In private industry, IT is more of an expense, but it’s also seen as a competitive advantage. They have really good tools to measure these kinds of things so they can get the maximum out of every dollar they invest.”

On the other hand, in the public sector, if agency divisions such as IT are provided with money to spend, they tend to spend it all, he said.

And agencies need as wide-ranging an understanding of their environment as possible, said Gartner's Massimiliano Claps, because many things must be considered at the same time.

“There’s a parallel relationship between internal consolidation [in agency data centers] and the cloud,” he said. “Because of that, they should be looking holistically at their portfolio of systems and services they offer to their users, what the business value and technical viability of those systems are, and then considering for each the alternatives of virtualizing and consolidating the data center or moving to the cloud.”
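One way to picture that holistic portfolio review is a simple scoring pass over each system, as in the sketch below. The two scores, the thresholds and the dispositions are hypothetical placeholders; a real assessment would rest on agency-specific criteria rather than this toy rubric.

```python
# Illustrative portfolio triage: score each system on business value
# and technical viability, then suggest a disposition.
# Scores, thresholds and categories are hypothetical, for illustration only.

def disposition(business_value, technical_viability):
    """Map two 1-10 scores to a rough consolidation recommendation."""
    if business_value >= 6 and technical_viability >= 6:
        return "virtualize and consolidate in-house"
    if business_value >= 6:
        return "consider migrating to a cloud service"
    if technical_viability >= 6:
        return "keep as-is, revisit next cycle"
    return "candidate for retirement"

# Hypothetical systems with (business value, technical viability) scores.
portfolio = {
    "payroll":       (9, 4),
    "public portal": (8, 8),
    "legacy HR app": (3, 2),
}

for system, (value, viability) in portfolio.items():
    print(f"{system}: {disposition(value, viability)}")
```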

Agencies can use a variety of tools to measure things such as the peaks and valleys of server use, Smid said, and that information can then be plugged into modeling applications to show which servers are good candidates for virtualization.
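The article doesn't name specific tools, but the general idea can be sketched as follows: given sampled utilization for each server, flag those whose peak load stays under a threshold as likely virtualization candidates. The sample data and the 40 percent cutoff below are assumptions for illustration, not a recommended sizing rule.

```python
# Illustrative sketch: flag virtualization candidates from utilization
# samples. Data and the peak-load cutoff are assumed, not prescriptive.

PEAK_THRESHOLD = 0.40  # servers peaking below 40% CPU are candidates

# Hypothetical hourly CPU utilization samples (fraction of capacity).
utilization = {
    "web-01":  [0.05, 0.12, 0.30, 0.22, 0.08],
    "db-01":   [0.55, 0.72, 0.90, 0.81, 0.60],
    "file-01": [0.02, 0.04, 0.10, 0.06, 0.03],
}

for server, samples in utilization.items():
    peak, avg = max(samples), sum(samples) / len(samples)
    status = "candidate" if peak < PEAK_THRESHOLD else "keep physical"
    print(f"{server}: peak {peak:.0%}, avg {avg:.0%} -> {status}")
```

A modeling application would go further, packing candidate workloads onto shared hosts while leaving headroom for their combined peaks, but the measurement step above is where that blueprint starts.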

“For correct consolidation, you need a phased approach based on a complete blueprint that has all of the right information available,” he said. “Without it, the reality is that with server virtualization, you're capable of developing as much virtual server sprawl as you originally had with physical servers, which is the reason you are consolidating in the first place.”

About this Report

This special report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Government Information Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Government Information Group Content Solutions, please email us at GIGCustomMedia@1105govinfo.com.