Tougher sledding ahead for data-center consolidation
The low-hanging fruit. The early wins. The quick-return investments.
Whatever you call it, the relatively easy phase of the massive Federal Data Center Consolidation Initiative (FDCCI) is over. Many departments and agencies have rid themselves of the most obvious efficiency offenders: small data centers, facilities previously marked for closure and IT resources deemed surplus to mission requirements.
Since FDCCI was announced in February 2010, federal IT shops have closed about 20 percent of the data centers the initiative is seeking to eliminate by 2015. Some agencies are already reporting significant cost savings. But wringing more efficiency out of the government’s IT infrastructure calls for a greater level of effort. Therefore, the next round of consolidation will require more time and planning, with greater potential for significant operational changes, according to agency and industry executives.
The Census Bureau is already following the trajectory from straightforward to more difficult consolidation tasks. Last year, the bureau closed seven data centers as planned, including a secondary center that provided a surge capability for the decennial census. That move, in conjunction with server virtualization, will save the bureau $1 million, CIO Brian McGrath said.
Now, the bureau is tackling the challenge of virtualizing and consolidating more sophisticated systems.
“Up until last year, there was really a lot of low-hanging fruit,” McGrath said. “The pace begins to slow as we tackle more complex applications and systems.”
In addition to technical tasks, such as guiding systems along the consolidation path, federal IT managers face a number of organizational and cultural challenges. As agencies close data centers and move applications to centralized facilities, many of them are dealing with multitenant environments for the first time. And as consolidation continues, IT jobs will change and agencies will need to train employees to meet new challenges.
Indeed, as the consolidation effort progresses, some managers are rating the cultural challenges as more difficult than the technical ones.
“Technically, this initiative is fairly straightforward,” said Rob Wolborsky, chief technology officer at the Space and Naval Warfare Systems Command. “Much more profound are the changes associated with culture, policy and advocacy.”
The Navy plans to consolidate at least 58 data centers into the Navy Enterprise Data Center in the next five years.
“We have tremendous support from Navy leadership to get this done,” Wolborsky said. “However, this is a major change in the way data centers are doing business, and it will require a huge cultural shift as well.”
The next stage
In the first couple of years, FDCCI focused to a large extent on computing resources that all but begged to be consolidated.
“Most of these early wins have taken advantage of natural transition points — lease expirations, server refresh cycles — to drive consolidation decisions,” said Audrey Taylor, senior director at the Corporate Executive Board, an organization that provides research and advisory services to commercial and government entities.
Zach Baldwin, FDCCI project manager at the General Services Administration, said some of the early federal consolidation projects involved smaller data centers and “lift and shift” projects, which shuttle computing gear from one data center to another.
The initial consolidation effort has closed 250 data centers governmentwide so far, with a total of 479 slated to close by the end of fiscal 2012, Baldwin said. Overall, the federal initiative aims to close at least 40 percent of the 3,133 data centers in existence — or some 1,200 centers — by the end of fiscal 2015. The 3,133 total is a revised baseline; last year, the Obama administration expanded FDCCI to include data centers of any size, not just those occupying 500 square feet or more, as stated in the original directive.
Hitting the new target is where the hard work of consolidation comes in.
“I would estimate that we have only begun to scratch the surface of what needs to be done around data center consolidation,” said Nick Combs, CTO at EMC Federal. “It is easy to cut the first 10, 15, even 20 percent. We have taken the low-hanging fruit and now have the hard work to do.”
Sudhir Verma, vice president of consulting services at Force 3, a solutions provider that focuses on data centers and other IT areas, said agencies have reached a plateau of sorts. They tend to have the quick-return projects well in hand, but to get to the next level of consolidation, they must pursue the more arduous chore of assessing and virtualizing more mission-critical, tier-one applications. “That is really the foundation of the next wave of consolidation,” Verma said.
Indeed, Baldwin said agencies have shifted the consolidation focus to virtualization, rationalization and enterprise services, which he said take more time than traditional lift-and-shift projects.
“The things we are pushing for now...take some more time with planning,” he said. “But I think potentially those types of consolidation activities have more value in the long run and help push us to higher utilization and enterprise services.”
Officials at the Census Bureau “are looking at the other systems and applications that are a little more challenging to migrate,” McGrath said, and are seeking ways to virtualize applications such as geographic information systems, accounting, data analytics and custom data processing systems for economic indicators. The bureau is creating project plans with the goal of delivering those applications through virtual desktop infrastructure.
Taking the time to think through the value of applications before migrating them to a consolidated environment only makes sense, government and industry executives say. Although the assessment phase might slow the pace of consolidation, it saves money and limits complexity over the long term.
An agency closing a data center might opt to move systems and applications en masse to a centralized data center or a private cloud. The Agriculture Department’s Food Safety and Inspection Service (FSIS), however, ruled out that forklift method when it was time to migrate its IT systems.
“The cost would have been a lot more if we just picked it up and moved it,” said Janet Stevens, CIO at FSIS.
Instead, FSIS consolidated and virtualized its IT resources as much as possible before the move. That approach minimized the amount of space FSIS required in a partner’s data center and reduced the associated per-CPU costs. Similarly, rationalizing applications limits the roster of software that must be maintained.
In 2008, FSIS awarded a contract to Unisys to migrate its data center to USDA’s enterprise data centers. Stevens said most of the migration was completed by January 2011.
Jeff Bergeron, chief technologist at HP Enterprise Services’ U.S. Public Sector, said steps such as application rationalization can help agencies pare down their infrastructure.
“[Officials] need to take a hard look at their mission and objectives, their business processes and the supporting applications, and evaluate the system of systems that run their agency,” he said.
Such assessments identify duplicate, overlapping or obsolete applications that could be modernized or retired, Bergeron said. Retiring a single application can potentially eliminate three sets of infrastructure: development, test and production. The approach frees up energy, floor space, staff and other assets to improve agency operations in other areas, Bergeron added.
The consolidation struggle doesn’t end with sorting and migrating systems. Agencies will need to deal with the shift from dedicated, in-house computing resources to a shared-services, multitenant environment.
“The former...system owners who owned the infrastructure and systems and applications actually had those right next to them every day,” said Joseph Beal, IT security program manager at Creative Computing Solutions, who works under a contract at the Homeland Security Department.
The challenge is that users and system owners have trouble understanding how government service providers will support the systems and controls they are used to having in place, Beal said. They also have trouble envisioning how centralized government IT organizations will provide the services they need.
“Culture is more of the challenge,” Stevens said. “Technology is easier — it is a lot more binary.”
On the cultural side, IT personnel will find themselves moving into spaces they don’t own and where they might need an escort to get to a server. They will also have questions about their work assignments in the new environment. Because the IT group is changing the way it does business, retraining staff or reorganizing the IT department might be necessary, Stevens said. Organizations should also plan for turnover because some people might leave if their job responsibilities change, she added.
Furthermore, although consolidation turns some agencies into customers, it turns others into service providers.
When Census officials virtualized their primary data center and consolidated storage, the move freed floor space that the bureau now offers to other agencies. The International Trade Administration (ITA), part of the Commerce Department, migrated to the bureau’s data center last year. McGrath said he has provided cost information to a half-dozen other agencies that are factoring that information into their planning for consolidating their infrastructures.
The bureau has defined the services it offers in terms of floor space, power and cooling. McGrath noted that some agency clients have different cultures and operating procedures but added that Census enjoys many commonalities with ITA. For instance, the organizations go through the same security and badging process, so granting customers physical access to the bureau’s data center was easier than it would have been otherwise.
In general, however, shared services and multitenancy raise security issues for agencies. Beal said one concern is the ability of the service provider to implement controls to prevent cross-pollination of access privileges and exposure to attack vectors and vulnerabilities.
The shared environment also calls for attention to governance because a glitch in one application can interrupt a number of customers’ services and mission support. That’s why a mechanism should be in place to sort out potential conflicts, such as determining which agency customer goes back online first in the event of an outage.
“You really do need to have the right governance process and the right people at the table,” Stevens said. Likely participants include line-of-business managers, IT staff and service provider representatives.
Learning from industry’s experiences
Moving from consolidation to shared services involves a fairly steep learning curve. Stevens recommended that agencies review Government Accountability Office reports to get a sense of what others have done.
In addition, Baldwin said the FDCCI task force holds monthly meetings that feature an agency presentation on a consolidation best practice or success story. Companies were asked to create white papers detailing industry best practices, and a library of those documents is now available to agencies.
Wolborsky said the Space and Naval Warfare Systems Command plans to benefit from industry’s experience as it moves forward on data center consolidation.
“We want to learn as much as we possibly can from industry — the most effective things done and their lessons learned to increase capability [and] security and reduce overall cost,” he said. “We expect a lot of what we do in the future will be based on what industry tells us.”
Consolidation progress report
Last year, the goals of the Federal Data Center Consolidation Initiative were revised to reflect the expanded definition of a data center. Here is where the effort now stands.
Original goals
- Eliminate 40 percent of data centers — defined as 500 square feet and larger — by the end of fiscal 2015.
- Data center baseline: about 2,100.
- Consolidation closure target: at least 800 data centers.
Revised goals
- Eliminate at least 40 percent of data centers of any size by the end of fiscal 2015.
- Data center baseline: 3,133.
- Consolidation closure target: at least 1,200 data centers.
March 2012 quarterly update
- Actual closures in fiscal 2011: 198.
- Projected closures by the end of fiscal 2012: 479 (250 closed to date with 229 more planned).
- Projected closures by the end of fiscal 2015: 1,023.
Source: General Services Administration
Casting a wider net
The Obama administration has boosted its data center consolidation target from 800 to at least 1,200 based on a broader definition of a data center. Last year, the definition expanded from facilities of at least 500 square feet to essentially any space that houses a server.
The inclusion of smaller facilities should help the initiative meet its objective of streamlining little-used assets through virtualization and enterprise services. However, government and industry executives have differing views on how the new definition will affect agencies’ consolidation plans.
For agencies that have kept server sprawl to a minimum, the difference could prove minimal. Brian McGrath, the Census Bureau’s CIO, said the new terms haven’t had much of an impact on the bureau because it has maintained a fairly disciplined IT environment.
“We just don’t have...proliferations of closets with a couple of servers in them,” he said.
Zach Baldwin, Federal Data Center Consolidation Initiative project manager at the General Services Administration, said other agencies might have the same experience. “Counting these facilities will not change plans as much as it will give credit for work already being done by the agencies,” he said.
Dmitry Kagansky, chief technologist at Quest Software Public Sector, said budgetary concerns have had a greater impact than the revised definition of a data center.
“In fact, some government agencies were consolidating smaller data centers that didn’t fit the earlier definition, so the redefinition legitimized what they were already doing,” he said.
The potential payoffs
On the other hand, John Lambeth, senior vice president and CIO at QinetiQ North America, said the revised definition casts a wider net over mission-critical systems.
“This redefinition asks agencies to ensure that all mission-critical systems are assessed and moved to appropriate computing facilities,” he said. “This includes not only agency-owned data centers but also co-location space and small data rooms located within facilities.”
Agencies that still keep plenty of servers in closets will find themselves with more work to do. But at least that’s the simplest form of consolidation.
“This renewed focus on smaller closet-like data centers is giving agencies the ability to get some quick wins under the belt before moving to the larger mega data centers, which can take some time to consolidate,” said Susie Adams, chief technology officer at Microsoft Federal.
Rob Stein, vice president of federal civilian business at NetApp’s U.S. Public Sector, agreed. “Most data center consolidation to date by agencies has focused on physical consolidation, and even much of this was shutting down non-performing, non-optimal data center spaces and closets as opposed to professionally managed facilities,” he said.
“The real payoff from consolidation will come from consolidating IT infrastructure down to the shared server and storage level and fostering multitenancy so that there can be shared infrastructure among agencies and/or departments,” Stein said.
John Moore is a freelance writer based in Syracuse, N.Y.