How to save energy and money through data center consolidation

NASA has discovered the benefits of two critical elements of data center optimization: power management and virtualization, which go hand in hand.

The Obama administration has challenged federal agencies to improve efficiency in their data centers, but in many ways, the deck seems stacked against them.

Federal CIO Vivek Kundra launched the Federal Data Center Consolidation Initiative in February. The FDCCI seeks to reduce energy use, cut IT costs and improve security.

However, agencies face a number of obstacles to consolidation: a lack of upfront funding, technical hurdles, unrealistic timelines, and cultural and political problems. As a result, data center consolidation could take a decade to achieve, according to an Input report titled “Assessment of the 2010 Federal Data Center Consolidation Initiative.”

Two-thirds of the data center managers and CIO office executives at federal agencies interviewed by Input said “the biggest obstacle was lack of upfront funding,” said Angie Petty, a principal analyst at Input’s Federal Industry Analysis program and a co-author of the report.

“Unfunded mandates have been cited as the downfall of the 1995 federal data center consolidation initiative, and this time around, the White House has again chosen not to set aside additional funding for data center consolidation efforts,” the report states.

The lack of funding limits the actions that agency IT managers can take. So agencies are using technology refresh cycles to get around that obstacle, Petty said.

However, consolidation is not the only way to achieve data center efficiency, said Chris Kemp, NASA’s chief technology officer for IT, a new position created in May.


Managing energy use is another way to optimize data center performance and reduce costs. Kemp is leading efforts to measure power consumption across NASA’s facilities and infrastructure.

If the agency can more closely monitor power consumption by its IT and facilities equipment, managers can make more informed decisions about which systems are candidates for consolidation and virtualization, Kemp said. That knowledge can help the CIO develop a more viable data center consolidation plan, he added.

Agencies had to submit data center consolidation plans aligned with their fiscal 2012 budget requests to the Office of Management and Budget by Aug. 30. OMB is reviewing those plans, with the goal of having agencies begin executing them in 2011.

In the spring, NASA halted plans to build a $1.5 billion data center because of new OMB guidance on cloud computing, federal data center consolidation, green IT and virtualization. Agency officials said they would develop a data center consolidation plan that included a data center architecture and full assessment of its facilities.

Most large-scale data center consolidation initiatives are already under way or at least in the planning stages, according to Input. “However, the lack of funding and adequate planning time and resources for FDCCI present major obstacles to it being the massive, game-changing program that it is intended to be,” the report states.

Instead, FDCCI will stimulate agencies to combine smaller server rooms and closets, look to make existing larger data centers more efficient, and accelerate the adoption of solutions such as virtualization and cloud computing, the report states.

In the end, those actions will generate cost savings, or at least stabilization of expenses, in light of rising computing demand from federal users and increasing energy costs, according to the report.

Taking Stock

“You can’t manage what you can’t measure,” Kemp said, explaining why NASA is trying to measure the consumption of power across the agency.

But that is difficult because, like many large agencies, NASA runs nearly every type of IT infrastructure that has ever existed. Aggregating data is easy if an organization has the same kinds of servers, uninterruptible power supply systems and equipment across the enterprise.

“But if you have 10 different large facilities with over 100 data centers of different sizes spread across the agency,” being able to see all that information and make decisions about how to optimize that environment is a daunting task, Kemp noted.

To achieve that, the agency is working with Power Assure, a data center optimization company.

“What is unique about Power Assure is they offer this any-to-any hardware-to-software integration,” Kemp said. That lets data center managers use their existing investment in meters and measuring devices. For example, if existing uninterruptible power supply devices have serial ports, data center managers can aggregate feeds from hundreds of manufacturers, spanning facilities and IT protocols, into a single dashboard view.

As a result, managers have an aggregated view of their IT and facilities infrastructure, Kemp said.

Power Assure provides a software-as-a-service model, so no software is deployed at users’ sites, said Power Assure CTO Clemens Pfeiffer. Instead, users install an appliance on their network that collects information from devices and sends it back to an analytical engine in Power Assure’s data center.
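The pattern Pfeiffer describes, an on-site collector pushing device readings to a hosted analytics engine, looks roughly like the sketch below. The device list, endpoint URL, payload format and read_watts stub are illustrative assumptions, not Power Assure’s actual interfaces.

```python
# Hypothetical sketch of the collector-appliance pattern: poll power readings
# from local devices, then forward them to a hosted analytics service.
# The device list, endpoint URL and payload format are illustrative
# assumptions, not Power Assure's actual interfaces.
import json
import time
import urllib.request

ANALYTICS_ENDPOINT = "https://analytics.example.com/readings"      # assumed URL
DEVICES = {"ups-rack-01": "10.0.0.21", "pdu-row-03": "10.0.0.35"}  # assumed inventory


def read_watts(address: str) -> float:
    """Placeholder for a protocol-specific read (SNMP, Modbus, serial, etc.)."""
    return 4200.0  # stubbed value for illustration


def collect_and_forward() -> None:
    readings = [
        {"device": name, "watts": read_watts(addr), "timestamp": time.time()}
        for name, addr in DEVICES.items()
    ]
    request = urllib.request.Request(
        ANALYTICS_ENDPOINT,
        data=json.dumps(readings).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # ship the batch to the hosted engine


if __name__ == "__main__":
    collect_and_forward()
```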

The dynamic power optimization software includes advanced automation modules that support server, system management, inventory and building management platforms. The software also has a virtualization manager that extends power management visibility to Citrix, IBM’s PowerVM, VMware and the Xen open-source virtualization platform.

Dynamic power management software lets users perform what-if analysis for capacity, performance and power. It also provides application integration and synchronization with major system, network, inventory, and building management solutions. Users can view power and utilization information via a Web portal.
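A what-if analysis of the kind described above can be as simple as comparing power draw before and after folding physical servers onto virtualization hosts. The sketch below uses assumed wattages and consolidation ratios for illustration; it is not the vendor’s model.

```python
# A minimal what-if sketch, assuming illustrative wattages and consolidation
# ratios: compare power draw before and after folding lightly used physical
# servers onto virtualization hosts.
def whatif_consolidation(physical_servers: int, watts_per_server: float,
                         vms_per_host: int, watts_per_host: float) -> dict:
    hosts_needed = -(-physical_servers // vms_per_host)  # ceiling division
    watts_before = physical_servers * watts_per_server
    watts_after = hosts_needed * watts_per_host
    return {
        "hosts_needed": hosts_needed,
        "watts_before": watts_before,
        "watts_after": watts_after,
        "savings_pct": round(100 * (watts_before - watts_after) / watts_before, 1),
    }


print(whatif_consolidation(physical_servers=100, watts_per_server=350,
                           vms_per_host=20, watts_per_host=600))
# {'hosts_needed': 5, 'watts_before': 35000, 'watts_after': 3000, 'savings_pct': 91.4}
```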

Computers consume power and generate heat, and chillers run constantly to cool the infrastructure, Kemp said. Modern data centers can control those chillers, but retrofitted or older data centers lack that instrumentation. Even so, Power Assure’s software lets NASA aggregate power-use information into a single view, Kemp said.
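As a rough illustration of why cooling matters, every watt the IT equipment draws ends up as heat that the chillers must remove, and the chillers draw power of their own. Both figures in the sketch below are assumptions for illustration, not NASA measurements.

```python
# A rough sketch, assuming a cooling overhead factor: every watt of IT load
# becomes heat, and the chillers that remove it draw additional power.
# Both figures below are illustrative assumptions, not NASA measurements.
it_load_kw = 500.0        # assumed IT equipment load
cooling_overhead = 0.6    # assumed watts of cooling per watt of IT load

total_kw = it_load_kw * (1 + cooling_overhead)
print(f"Total facility draw: {total_kw:.0f} kW for {it_load_kw:.0f} kW of IT load")
# Total facility draw: 800 kW for 500 kW of IT load
```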

NASA officials are assessing all their data centers as potential targets for consolidation or virtualization. “We are assessing the metering and monitoring on those facilities, and our intent is to complete the assessment of all our data centers by the end of this year,” Kemp said. The agency plans to implement metering and monitoring of the facilities by the first half of 2011.

“This is part of our plan when OMB asked us what we are doing to respond to the call to increase the efficiency of our infrastructure,” Kemp said.

“If you had an infinite amount of money to throw at the problem, you would virtualize everything and move everything to the cloud or to a cloud hosting provider,” Kemp said.

“The reality is that this is a process that will take a couple of years,” he said. “And we need to identify where opportunity exists to achieve the maximum savings.” NASA officials hope they can pay for a lot of the modernization through the initial cost savings.

Virtual Punching Power

Tackling energy consumption is an idea that is now unavoidable, said Pat Grow, federal field solutions architect at CDW Government.

Consolidation has become critical as data centers reach capacity. In addition, 64 percent of government users surveyed for a forthcoming CDW-G report on data center consolidation said they want to reduce their energy consumption, and the current and previous administrations have both required agencies to shrink their power footprint, Grow said.

Government agencies have extended their IT refresh cycles because hardware lasts longer, but only in the past couple of years has technology become markedly more energy efficient, so much of the installed base remains inefficient. Meanwhile, server sprawl in data centers has taxed power and cooling systems.

Government users have said, “We can’t add anything until we add more cooling,” he said. But additional cooling consumes more electricity itself, so agencies can’t add more watts of IT load until they take care of cooling. “So we are finding the biggest thing leading into consolidation is to be able to virtualize servers and storage. That gets them the biggest punch right up front,” Grow said.

Virtualization broadly refers to consolidating several systems onto one physical machine while maintaining their separate identities.

Whenever Grow visits an organization’s data center and sees old hardware, he says, “These are going to be the felons of power consumption.” In many cases, someone is running a report on an old server once a quarter, but the organization hasn’t found a way to get rid of the box.

For the most recent CDW-G virtualization report, Grow said that of 100 servers surveyed, 20 were not in use. However, they were still being patched and were still consuming power.
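To put that finding in perspective, here is a back-of-the-envelope sketch of what 20 idle-but-powered servers might cost. Only the 20-of-100 figure comes from Grow; the per-server wattage and electricity rate are assumptions for illustration.

```python
# Back-of-the-envelope cost of idle-but-powered servers. Only the 20-of-100
# figure comes from the CDW-G survey; the wattage and electricity rate are
# assumed for illustration.
idle_servers = 20
watts_each = 300          # assumed average draw of an idle rack server
rate_per_kwh = 0.10       # assumed electricity price in dollars
hours_per_year = 24 * 365

annual_kwh = idle_servers * watts_each / 1000 * hours_per_year
print(f"Idle servers waste roughly ${annual_kwh * rate_per_kwh:,.0f} per year")
# Idle servers waste roughly $5,256 per year
```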

Agencies are finding that if they can consolidate through processes such as virtualization, they can get a better handle on what they have. An additional virtual server essentially adds no power load or external heat load to the data center, Grow said.

State governments also are looking to save money and energy through data center consolidation, and virtualization is the cornerstone of those efforts, or at least plays a big role in them.

New Mexico’s Department of Information Technology (NM DOIT) will offer infrastructure as a service to other agencies in the state through a private cloud by the end of the year. Three years ago, NM DOIT embarked on a $6 million project to upgrade its 35-year-old data center.

The center had reached capacity, and adding servers wasn’t an option. It had maxed out its cooling systems, and managers could barely keep the facility cool, said Michael Martinez, director of enterprise operations at NM DOIT.

So IT officials embarked on a project to consolidate all the homegrown data centers — wiring closets or server rooms that had some cooling and power units — at agencies across the northern part of the state.

The infrastructure-as-a-service model will help New Mexico dramatically reduce IT expenses and electricity costs because agencies will not need to buy their own IT hardware, which consumes a lot of electrical power. In some cases, agencies will move their physical servers to the Santa Fe-based facility. However, NM DOIT officials want agencies to migrate their applications to virtual servers inside the data center.

The cornerstone of the infrastructure is Cisco Systems’ Unified Computing System. UCS integrates compute, network, storage access and virtualization resources in a single energy-efficient system geared to reduce IT infrastructure costs and complexity. The private cloud also consists of new storage units and VMware virtualization software.

“We have a large number of servers already in the data center,” Martinez said. “So the first item was to consolidate our own.” For example, DOIT’s old e-mail system, Microsoft Exchange 2003, resides in seven racks. The department will consolidate the racks — completely filled from floor to ceiling with servers or storage — to a single chassis with a new storage unit added to it.
 
The organization will move from seven racks to one-and-a-quarter racks, which will save floor space. Plus, there will be a major drop in power consumption because the old e-mail boxes — approximately 50 servers — cost $43,000 per year in electricity to run. “We’re estimating that the new boxes will cost about $6,000 a year with the new chassis,” he said.
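Working through the figures Martinez cites makes the scale of the savings concrete. Only the dollar amounts and rack counts below come from NM DOIT; the percentage is simple arithmetic.

```python
# Savings implied by the figures Martinez cites: about 50 old e-mail servers
# in seven racks at $43,000 a year, replaced by a new chassis at roughly
# $6,000 a year in about 1.25 racks.
old_annual_cost = 43_000
new_annual_cost = 6_000

savings = old_annual_cost - new_annual_cost
print(f"Annual electricity savings: ${savings:,} "
      f"({100 * savings / old_annual_cost:.0f}% reduction)")
# Annual electricity savings: $37,000 (86% reduction)
```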

Martinez said he expects the department to complete the work by December. NM DOIT plans to move to a Microsoft Exchange 2010 e-mail system.

Consolidation’s Power Surge

A lot of the data center consolidation plans submitted by federal agencies in August are at a high conceptual level, said Jay Owen, a vice president at Schneider Electric, who based his assessment on feedback from government users and members of the CIO Council.

In those plans, many agencies did not outline what they need to do to their facilities to accomplish their consolidation goals, he said. The message Schneider Electric is trying to convey to government agencies is that consolidation will drive higher densities in the existing or new facilities they consolidate into.

The amount of computing power that fits into a rack is much higher after consolidation than in an existing environment, which leads to a number of power and cooling challenges, Owen said. “Most data centers cannot handle the densities that you typically see after a consolidation project without significant upgrades,” he said.
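A simple sketch of that density shift, using assumed server counts and wattages rather than Schneider Electric data, shows how per-rack load, and therefore heat, climbs after consolidation.

```python
# Assumed figures showing how per-rack power density climbs after
# consolidation; these are illustrative numbers, not Schneider Electric data.
def rack_density_kw(servers_per_rack: int, watts_per_server: float) -> float:
    return servers_per_rack * watts_per_server / 1000


before = rack_density_kw(servers_per_rack=10, watts_per_server=350)  # legacy rack
after = rack_density_kw(servers_per_rack=40, watts_per_server=450)   # dense virtualization host rack
print(f"Per-rack load: {before:.1f} kW before vs. {after:.1f} kW after consolidation")
# Per-rack load: 3.5 kW before vs. 18.0 kW after consolidation
```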

Data center managers need a strategy to overcome those power and cooling challenges, he said. Schneider Electric offers assessment services to help organizations better understand their environments, along with systems designed to handle high-density loads. For example, organizations can gain a lot more efficiency by moving cooling equipment closer to the heat source. Schneider Electric offers InRow cooling systems, which look like IT racks and sit alongside IT equipment, Owen noted.

Step by Step

Vick Vaishnavi, vice president of worldwide marketing at BMC Software, offers agencies a five-step plan for successful data center consolidation. The first step is devising a solid plan that addresses the business case for consolidation and the staff skills needed to manage the infrastructure, he said.

Second, agency IT managers need a good discovery tool to view every asset in their environment so they can determine the dependencies of every physical component. Next, managers should identify the best candidates for consolidation. The fourth step is to repurpose and reconfigure systems, such as moving from physical to virtual servers. The fifth step is for managers to evaluate how well they have met their goals.
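The candidate-selection step lends itself to a simple filter over whatever inventory the discovery tool produces. The records, field names and 20 percent utilization threshold below are illustrative assumptions, not BMC’s methodology.

```python
# Illustrative sketch of the candidate-selection step: filter a discovered
# inventory for lightly used servers with few dependencies. The records and
# the 20 percent threshold are assumptions, not BMC's methodology.
inventory = [
    {"host": "app-01", "cpu_util_pct": 12, "dependents": 1},
    {"host": "db-02", "cpu_util_pct": 78, "dependents": 9},
    {"host": "rpt-03", "cpu_util_pct": 4, "dependents": 0},
]

candidates = sorted(
    (s for s in inventory if s["cpu_util_pct"] < 20),
    key=lambda s: (s["dependents"], s["cpu_util_pct"]),
)
for server in candidates:
    print(f"{server['host']}: {server['cpu_util_pct']}% CPU, "
          f"{server['dependents']} dependent systems")
# rpt-03 and app-01 surface as the easiest physical-to-virtual moves
```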

Consolidation starts with asset identification and control, said Dan Kent, director of federal solutions at Cisco. Agencies then need to standardize software licenses and services under core contracts. From there, they can begin true virtualization, with the end goal of optimizing the data center, Kent said.

The first phase of the next-generation data center is a private cloud in an automated environment, he said.