Is Kundra's data center consolidation initiative doomed?
Doubts about the Federal Data Center Consolidation Initiative deserve a second look
- By John Zyskowski
- Nov 03, 2010
It was a good idea and no doubt launched with the best intentions, but it might be time to reassess Vivek Kundra’s chances of success in reducing the number of federal data centers.
The obstacles ahead of the federal CIO’s Federal Data Center Consolidation Initiative (FDCCI) are numerous and growing: already-scarce funding at risk of further reduction, no hard consequences for lack of compliance, and a short timetable for developing plans that many agencies are struggling to meet.
On top of all that, it turns out the feds have nearly twice as many data centers as previously thought. A fresh inventory that wrapped up July 30 revealed that there are 2,094 federal data centers, not the 1,100 that Kundra cited when he launched the initiative in February. And it seems as though the government is building new data centers — from the National Security Agency’s massive $1.5 billion cybersecurity facility in Utah to a $1.1 million Census Bureau facility in Atlanta — as fast as it can pull the plug on older ones.
In September, market research firm Input issued an assessment of FDCCI. One of its conclusions is that consolidation could take years, possibly even a decade, to start realizing significant savings given the dispersed and nonstandard nature of many federal IT assets. In two years, the officials at the Office of Management and Budget who hatched the plan might not be around.
Given the hurdles, is FDCCI worth the effort? After all, many agencies have taken the first steps toward consolidation on their own by adopting server virtualization, which allows them to use fewer physical servers to handle the same workload.
However, some experts think the initiative will make a difference. Some even say the obstacles cited, though real, are not as daunting as they appear.
“It’s lit a fire under many agencies that may not have been doing something,” said Angie Petty, a principal analyst at Input and lead author of the firm’s FDCCI report. “I think it will stimulate more progress and in a faster manner than would occur organically.”
It will have to do so without a strong enforcement stick. The Obama administration is using the much softer stick of transparency by requiring agencies to report progress in areas such as server and space utilization, energy efficiency, and agency-specific consolidation goals. That management approach is consistent with the administration’s other IT policies, such as the federal IT Dashboard, Petty said.
But many of the federal IT executives Input interviewed expressed concern about FDCCI’s quick deadlines. “They said it was a little worrisome to put the plans together so rapidly,” Petty said.
Input analysts and other observers wonder whether the tight deadlines will hamper thoughtful planning. But David Fletcher, Utah's chief technology officer, said he thinks agencies don’t need a lot of time to weigh options.
“Solution options are well understood,” said Fletcher, whose state recently completed an 18-month consolidation project that reduced 37 data centers to two. “People have been thinking about this stuff for some time.”
He also said the lack of funding for federal consolidation is not the impediment that some make it out to be. “The return on investment, if you do it right, is so quick that it shouldn’t matter,” he said.
Utah spent some money upfront on its consolidation effort, but the return on investment came in less than six months through savings in equipment, power and personnel, he added.
Fletcher said the geographic proximity of Utah’s data centers made consolidation a relatively easy job compared to a federal agency with IT operations spread throughout the country. But he also said the grander scale of feds’ operations means more opportunity for savings.
FDCCI skeptics point out that in 1995, the Clinton administration launched an unsuccessful effort to reduce the number of federal data centers from 200 to 50. However, those earlier would-be consolidators faced a much more challenging technical environment, according to Input. Existing systems were more proprietary, and officials at the time didn’t have the robust storage, network, server virtualization and asset management tools that IT executives have today.
Critics also blamed a government culture that resists ceding control for derailing the earlier effort. Apparently, not much has changed. In a survey by NetApp and MeriTalk released in May, 86 percent of the federal IT professionals surveyed said culture is the No. 1 obstacle to the latest consolidation initiative.
John Zyskowski is a senior editor of Federal Computer Week. Follow him on Twitter: @ZyskowskiWriter.