Virtualization, cooling technologies cut agency data center energy costs

Agencies look to virtualization, cooling technologies to cut IT energy costs
 

Federal agencies are under a mandate, expressed through both executive orders and legislation, to cut energy use at their facilities. Taking 2005 as the base year, each agency is required to reduce its energy use by at least 30 percent by 2015. That puts IT squarely in the crosshairs because the energy used to power government IT equipment is one of agencies’ biggest energy costs.

The Office of Management and Budget made the point yet again when it launched the Obama administration’s Federal Data Center Consolidation Initiative in 2010. Without a fundamental shift in how technology is deployed, OMB said, energy use at federal data centers was set to double from 2006 levels, to more than 12 billion kWh in 2011.

The administration’s goal of closing some 800 government data centers by 2015 will take a big bite out of that overall energy use. But with the average price per kWh rising by some 10 percent a year, agency data center managers are still under pressure to come up with significant savings.

There are various ways they can do that.

Server virtualization
Servers can account for as much as half of the total energy used in a data center, so they are an obvious target for savings. With virtualization, data centers can improve server utilization by as much as four times over what they get from standalone physical servers, and the expectation is often that energy savings will follow in similar proportion.

Real-world examples show the final numbers often come in much lower. For example, Peter Sacco, president of consultant PTS Data Center Solutions, was recently faced with upgrading his company’s data center. He expected big energy savings from using virtualization to shrink his physical server count from 14 to five.

“All three of the manufacturers who told me about the benefits of virtualization severely overstated the case,” he said. “We thought it would be at least 50 percent, but it only produced 20 percent savings in the end, and that was a big surprise to us.”

The reason, he said, is that hosting capable virtual servers requires much beefier physical servers. There are some additional savings because the facility needs fewer supporting resources, but even so, the total savings from the PTS virtualization project came to just 25 percent.
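
A rough back-of-the-envelope calculation illustrates why the savings lag so far behind the reduction in server count. The wattages below are assumptions for the sake of illustration, not PTS’s actual figures; the point is simply that each virtualization host draws considerably more power than the smaller machines it replaces.

```python
# Hypothetical illustration: consolidating 14 modest servers onto 5 beefier hosts.
# All wattages are assumed for the example, not measured PTS figures.

old_count, old_watts_each = 14, 350   # assumed draw of each original 1U server
new_count, new_watts_each = 5, 800    # assumed draw of each virtualization host

old_total = old_count * old_watts_each    # 4,900 W
new_total = new_count * new_watts_each    # 4,000 W

savings = 1 - new_total / old_total
print(f"Server count cut by {1 - new_count / old_count:.0%}, "
      f"but server energy cut by only {savings:.0%}")
# Server count cut by 64%, but server energy cut by only 18%
```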

Sacco believes those numbers will probably scale directly to much bigger data centers.

Cooling data centers
Cooling accounts for much of a data center’s remaining energy use, but the answers aren’t as simple as virtualizing servers. A number of elements have to be considered to make the cooling plant as energy-efficient as possible.

For Sacco, the most important thing is to reduce, as much as possible, the mixing of air on its way to and from the air conditioner. Cold air needs to reach the servers as directly as possible, and the hot air coming off the servers needs to return to the air conditioner just as quickly.

“Anything you can do to make that happen improves efficiency,” he said.

Typically, that’s done with a hot-aisle/cold-aisle layout: server racks are arranged so that the hot exhaust from the backs of the servers flows down one aisle back to the air conditioner, while the cool supply air flows down the next aisle to the servers’ intakes.

Data centers could also use air economizers wherever that’s practical, said Jay Pultz, an analyst at Gartner Research.

“It’s equivalent to opening a window and using the outside air as much as possible,” he said. “Depending on where you are in the country, using outside air can reduce power needs between 4 and 15 percent.”

Other techniques use the variable-speed capability of modern air conditioners to match cooling output to demand, ramping up at peak compute times and ratcheting down at others. A little attention to the layout of the data center can also allow for zoned cooling, so that most of the cooling is directed to the servers that work the hardest.

Monitor and measure
A survey of 157 federal IT executives by MeriTalk, reported in June 2011, found that two-thirds of them did not know the average per-rack power consumption, in kilowatts, across their data centers. Fully 77 percent could not say what the power usage effectiveness (PUE), a basic measure of energy efficiency, was for their data centers.

There are several reasons for that, said Pultz. A big one is that data center managers traditionally have not had to pay their agency’s power bills, so knowing how much energy they were using hasn’t been necessary. Now, with the price of energy so high and data centers targeted as a fundamental “green” concern, they’ll increasingly need to know how much energy they use.

Monitoring and measuring aren’t cheap. Deploying all the meters that are needed and then integrating them onto a single platform so that managers can easily analyze energy use can cost well into six figures. But without that visibility, the bigger energy savings won’t be possible.

New servers for old
There’s a temptation in government agencies, particularly in a time of budget constraints, to stretch the use of legacy IT as much as possible. But that might not be the best strategy when it comes to energy efficiency.

Blade servers, for example, operate much more efficiently than 1U rackmount servers, in large part because the blades share the chassis’s power supplies and fans. As long as the chassis is filled with blades, it will use less energy than an equivalent set of 1U servers.

Likewise, newer servers are much more tolerant of heat than even three- or four-year-old machines, which means they can operate at higher inlet temperatures and therefore don’t need as much cooling. For every degree that agencies can raise the data center’s baseline temperature set point, they can save up to 3 percent in energy costs.
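
As a rough illustration of that rule of thumb, and assuming the per-degree savings compound and apply to the cooling load (both simplifications), raising the set point by a few degrees adds up quickly:

```python
# Illustration of the "up to 3 percent per degree" rule of thumb.
# The baseline figure and the compounding assumption are simplifications, not measured data.

baseline_cooling_kwh = 1_000_000       # assumed annual cooling energy
savings_per_degree = 0.03              # upper bound: 3 percent per degree
degrees_raised = 5                     # e.g., raising the set point from 68 F to 73 F

remaining = baseline_cooling_kwh * (1 - savings_per_degree) ** degrees_raised
saved = 1 - remaining / baseline_cooling_kwh
print(f"Raising the set point {degrees_raised} degrees saves about {saved:.0%} "
      f"of cooling energy ({baseline_cooling_kwh - remaining:,.0f} kWh)")
# Raising the set point 5 degrees saves about 14% of cooling energy (141,266 kWh)
```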

“I think we still have a way to go to recognize that the more modern equipment is way more resilient than we give it credit for,” said Sacco.

Besides the above strategies, data center managers should also be aware of the not-so-obvious. For example, Pultz said, the uninterruptible power supply (UPS) should not be considered just a backup power resource. In many data centers, it’s also used to “clean” the power supplied by utilities, removing power spikes and other anomalies before it’s provided to the servers and cooling plant.

“The efficiency of the UPS matters,” Pultz said. “If it’s only 80 percent, that’s not good. It needs to be at least 96 or 97 percent.”
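
A quick calculation shows what those efficiency figures mean in practice; the 500-kilowatt IT load used here is an assumed figure for illustration only:

```python
# Rough illustration of UPS conversion losses at different efficiencies.
# The 500 kW IT load is an assumed figure, not from any specific facility.

it_load_kw = 500.0

for efficiency in (0.80, 0.96):
    input_kw = it_load_kw / efficiency      # power the UPS draws from the utility
    wasted_kw = input_kw - it_load_kw       # lost as heat inside the UPS
    print(f"{efficiency:.0%} efficient UPS: draws {input_kw:.0f} kW, "
          f"wastes {wasted_kw:.0f} kW (heat that then has to be cooled as well)")
# 80% efficient UPS: draws 625 kW, wastes 125 kW ...
# 96% efficient UPS: draws 521 kW, wastes 21 kW ...
```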

Likewise, agencies should consider “rightsizing” data centers. Rather than building facilities for the demand expected 15 years down the road, which initially leaves a lot of empty space to power and cool, they should design facilities so that power and cooling are supplied only to the parts of the data center that need them at any particular time.

That way, agencies can save anywhere from 10 to 30 percent of the energy used in a traditionally designed data center, Pultz said.

Is PUE the right metric to use?

Power usage effectiveness is the metric that the Obama administration has chosen to measure the progress of agencies in cutting back their data center energy demands. According to Executive Order 13514 (Federal Leadership in Environmental, Energy and Economic Performance) issued in 2009, half of agency data centers should be operating at a PUE of between 1.3 and 1.6 by fiscal 2012.

That means the ratio of the total power consumed by the facility to the power delivered directly to the IT equipment should be no more than 1.6. In other words, for every watt that goes to the servers, no more than 0.6 additional watts should go to cooling them and to other infrastructure such as lighting and power distribution.
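
A minimal sketch of that calculation follows; the kilowatt figures are made up for illustration:

```python
# PUE = total facility power / power delivered to the IT equipment.
# Both kilowatt figures below are illustrative, not from any specific data center.

it_load_kw = 500        # servers, storage and network gear
facility_kw = 800       # total draw, including cooling, lighting and UPS losses

pue = facility_kw / it_load_kw
print(f"PUE = {pue:.2f}")   # 1.60: for every watt to IT, 0.6 watts go to overhead
```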

PUE is also the basis of the government’s Energy Star for data centers program, which is aimed at helping managers assess how efficiently their data centers use energy. According to the Environmental Protection Agency, which developed the program, data center PUEs typically range between 1.25 and 3.0.

An Energy Star award is coveted by agencies. To earn the label, a government data center must perform better than 75 percent of similar data centers nationwide.

As an article in Federal Computer Week pointed out in October 2010, however, the Energy Star label might not say much about how efficiently the IT equipment itself uses energy. A data center could have a good PUE but still use more computers, storage and network devices, and therefore more energy, to process a transaction than another data center with a worse PUE.

Others have also pointed out that PUE is easy to game. A higher IT load is all that’s needed to improve the number, and that can be achieved simply by turning on old, inefficient servers that had been decommissioned.
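
Continuing the illustration above with the same assumed figures, it is easy to see how powering up idle machines makes the metric look better even as total consumption rises:

```python
# Gaming the metric: adding idle servers raises the IT load (the denominator),
# so PUE "improves" even though the facility uses more energy overall.
# All figures are assumed for illustration.

it_load_kw, overhead_kw = 500, 300
print(f"Before: PUE = {(it_load_kw + overhead_kw) / it_load_kw:.2f}")   # 1.60

it_load_kw += 100     # power up 100 kW of decommissioned servers doing no useful work
overhead_kw += 20     # plus some extra cooling for them
print(f"After:  PUE = {(it_load_kw + overhead_kw) / it_load_kw:.2f}")   # 1.53, yet total draw is up 120 kW
```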

Still, said Peter Sacco, president of consultant PTS Data Center Solutions, in the absence of anything better, a little information beats none at all.

“PUE is close enough,” he said. “It’s as good as you are going to get right now, and I would argue it actually provides more than a little. Like anything else, however, you need to understand what you’re looking at. After all, you can get a PUE of 1.0 if you don’t have air conditioning.”

 

About this Report

This special report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Government Information Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Government Information Group Content Solutions, please email us at GIGCustomMedia@1105govinfo.com