Data Centers

Does counting data centers miss the point?


Closing data centers is a means to an end, not the end itself, says U.S. CIO Steven VanRoekel. (FCW photo)

One of the key measurements of the Federal Data Center Consolidation Initiative is how many data centers have been closed. But focusing on that as if it were the end in itself is a mistake, said U.S. CIO Steven VanRoekel.

The closures are more a means to an end, a necessary first step toward optimizing the federal government’s IT systems, he said -- an argument echoed by others in both industry and government.

VanRoekel was one of several federal officials who discussed data center consolidation at a Jan. 22 hearing before the House Committee on Oversight and Government Reform.

At the hearing, Rep. Carolyn Maloney (D-N.Y.) suggested that the data centers were put in place originally for good reasons, and asked how the government decided which ones to close -- or even whether they should be closed at all.

The question gave VanRoekel the chance to explain that closing a data center means moving its information elsewhere, not losing it. The overall effect is to retain the data, but lower the cost of storing and using it.

"In essence, we are going to optimize and close data centers by shifting the resources of one to other ones – to more efficient data centers – not taking away certain services, not deprecating any service we provide," VanRoekel said. "And if anything, while we make that shift, we modernize those systems and provide an even better service. It’s a really nice opportunity to build efficiency and effectiveness at a much lower cost."

Agencies are attempting to realize long-term savings in different ways, with each creating its own plan of action, according to David Powner, director of IT management issues at GAO.

In theory, when an agency closes a data center, it will realize savings from reduced energy consumption, eliminated facility costs and reduced labor, but a big chunk of potential savings depends upon how the agency optimizes its existing IT infrastructure.
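That trade-off can be sketched as simple arithmetic. The short Python sketch below is purely illustrative; the annual_savings function and every dollar figure in it are hypothetical assumptions, not numbers from GAO or the agencies involved. The point it shows is that the recurring savings from the building itself are small next to what optimizing the migrated workloads can recover.

# Illustrative back-of-envelope model. All names and dollar figures are
# hypothetical assumptions for this sketch, not figures from the article.

def annual_savings(facility_cost, energy_cost, labor_cost, it_spend, optimization_rate):
    """Estimate yearly savings from closing one data center.

    facility_cost, energy_cost, labor_cost: recurring costs eliminated outright by the closure.
    it_spend: annual spend on the hardware, software and storage that move elsewhere.
    optimization_rate: fraction of that IT spend recovered by consolidating workloads
        onto shared, more efficient infrastructure (0.0 = pure lift-and-shift).
    """
    fixed_savings = facility_cost + energy_cost + labor_cost
    optimization_savings = it_spend * optimization_rate
    return fixed_savings + optimization_savings

# Hypothetical comparison: the same closure with and without optimizing what moves.
lift_and_shift = annual_savings(400_000, 250_000, 300_000, 5_000_000, 0.0)
consolidated = annual_savings(400_000, 250_000, 300_000, 5_000_000, 0.30)
print(f"Lift-and-shift only: ${lift_and_shift:,.0f} per year")
print(f"With optimization:   ${consolidated:,.0f} per year")

Under these assumed figures, closing the facility alone saves roughly $950,000 a year, while optimizing the workloads that move raises that to about $2.45 million -- which is the argument Stein and others make below.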

Getting rid of a few obsolete servers that each run a single application can certainly save money, but an agency that simply moves old, inefficient applications and systems to another facility is giving up much bigger savings in the long run, said Rob Stein, NetApp’s public sector vice president. The key is to take the opportunity to make real changes in strategy.

"I think the cost benefits are huge if agencies go to a shared environment where multiple tenant applications use the same infrastructure, server, storage, networking and software in a secure manner," he said. "That way, they are reducing their real estate costs by closing centers and reducing the hardware and software infrastructure they need, too."

Mark Forman, former administrator for e-government and IT at the Office of Management and Budget, has expressed doubts about agencies’ willingness to "consolidate away the complexity" of client/server applications.

But Stein said some agencies are doing just that and saving large amounts of money in the process; data storage and management is one of the biggest expenses in the federal government’s $80 billion IT budget.

The National Institutes of Health, the Federal Aviation Administration, the Transportation Department's Enterprise Services Center and the State Department have cut data storage costs as much as 80 percent using NetApp’s software, Stein said. 

EMC, IBM, HP and Oracle provide agencies with similar products.

Added benefits of the shared environment are non-disruptive upgrades, increased scalability and easy integration with numerous software and hardware platforms. As Stein put it, technology is out there that allows "agencies to do a lot more with a much smaller chunk of real estate."

About the Author

Frank Konkel is a former staff writer for FCW.

