Is the containerized data center an answer?

Improving data center efficiency usually means upgrading many of the server, storage, power and cooling systems with newer technology that is both more energy-efficient than what it replaces and smaller, which helps in meeting density requirements. But that costs money, time and labor, so why bother? Why not just buy a new, efficient data center in one go?

That’s the idea behind containerized data centers, a growing part of the overall data center business. They are literally data centers housed in shipping containers, with servers, power supplies and cooling systems delivered as a single unit. Just plug them in and go.

NASA was one of the first government agencies to use containerized systems, employing them to help build its Nebula cloud computing platform, which provides compute and storage capacity on demand for NASA scientists and researchers. The containers enabled the agency to set up the data center infrastructure needed for Nebula in a fraction of the time, and at much lower cost, than building new data centers would have required.

Each container holds up to 15,000 CPU cores and some 15 petabytes of storage. Built on green energy principles, they were rated as some 50 percent more efficient than traditional, brick-and-mortar data centers when they were rolled into the NASA Ames Research Center site in 2009. They also helped cut the average data center planning cycle from an average of around two years to just 120 days.

NASA Goddard is also planning to install a containerized data center as one of the three data centers that will remain after it consolidates from its previous 13.

The Army has also recently opted for containerized data centers, as part of its five-year, $250 million Area Processing Centers Army Private Cloud (APC2) initiative, for which it made awards in January 2012.

The Army plans to close around 185 of its data centers by the end of 2015 as part of the Federal Data Center Consolidation Initiative (FDCCI), but it also has a moratorium in place against building any new data centers, so the containers provide a ready answer for new data center capacity for its private cloud initiative.

However, the jury is still out on the ultimate effectiveness of containerized data centers. You still have to find enough space to put the containers on site, provide security for them, and hook up power and chilled water supplies.

Modern server technology provides modular and relatively cheap ways to upgrade existing data centers. Blade servers deliver high-performance compute power in less space than traditional rack-mounted servers, use fewer power supplies and need fewer fans and less cooling. Intelligent power distribution units let data center managers know exactly where the power is going, how much is being used and even how hot servers are running.
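The per-outlet telemetry an intelligent PDU exposes can be rolled up into the view a data center manager actually acts on: power draw per rack and the hottest-running server. The sketch below assumes a hypothetical reading format; real PDUs expose these values through SNMP or a vendor-specific API, and the field names here are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class OutletReading:
    """One sample from a single PDU outlet (hypothetical format)."""
    rack: str
    server: str
    watts: float
    inlet_temp_c: float

def summarize(readings):
    """Roll per-outlet samples up to per-rack power and the hottest server."""
    power_by_rack = {}
    hottest = None
    for r in readings:
        power_by_rack[r.rack] = power_by_rack.get(r.rack, 0.0) + r.watts
        if hottest is None or r.inlet_temp_c > hottest.inlet_temp_c:
            hottest = r
    return power_by_rack, hottest

readings = [
    OutletReading("rack-01", "web-1", 412.0, 24.5),
    OutletReading("rack-01", "web-2", 398.5, 26.1),
    OutletReading("rack-02", "db-1", 655.0, 29.8),
]
power, hot = summarize(readings)
print(power)       # {'rack-01': 810.5, 'rack-02': 655.0}
print(hot.server)  # db-1
```

Even this simple aggregation shows why such telemetry matters for consolidation decisions: it identifies which racks are drawing the most power and which servers are running hot, without instrumenting the servers themselves.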

When you look at the technology behind containerized data centers, it’s based on virtualization with a management layer on top, said Faisal Iqbal, manager for systems engineering in Citrix Systems’ public sector group. That’s pretty much the same approach agencies are using to modernize their existing data centers.

Containerized data centers definitely have direct application to tactical scenarios in the military, he said, and even to some civilian agencies that want to drive down physical maintenance costs in their data centers.

“But you can derive the same kind of efficiencies with the data centers you own today, if you do things correctly,” he said. “There’s no reason why you can’t make them as efficient as these containerized data centers.”

About this report

This report was commissioned by the Content Solutions unit, an independent editorial arm of 1105 Government Information Group. Specific topics are chosen in response to interest from the vendor community; however, sponsors are not guaranteed content contribution or review of content before publication. For more information about 1105 Government Information Group Content Solutions, please e-mail us at [email protected].