The new diet plan: Virtualization, then major consolidation
Most organizations will require new tools and techniques to reap the full benefits
- By Jennifer Zaino
- Jun 18, 2007
One of the largest virtualization deployments in the government is at the Defense Contract Management Agency, which has nearly 650 virtual machines residing on about 60 physical computer servers. DCMA officials credit virtualization with helping them consolidate 18 data centers into five and realize a host of other benefits.
Virtualization software divides a physical server into isolated virtual environments, enabling organizations to run multiple applications or operating systems on a single server. Such consolidation helps agencies reduce the number of physical servers they must maintain and creates opportunities to streamline many administrative tasks.
But as most agencies soon learn, the flexibility and efficiency that virtualization brings also change the server management paradigm. Virtualization requires new tools to support the new infrastructures, new processes to govern them and new employee skills.
“This is one of the biggest traps that enterprises fall into — thinking their tools to manage physical systems are good enough to manage virtual systems,” said Andi Mann, senior analyst at Enterprise Management Associates. “In most cases, that’s not true.”
Companies offer an increasing number of tools for managing virtual environments, and a few provide hybrid tools that manage limited aspects of both physical and virtual environments, Mann said. However, no one offers a single toolset that organizations can use to support all their management disciplines across physical and virtual infrastructures. And even if such tools were available, virtualization experts say, organizations that used only one set of techniques to manage both environments could miss many of the benefits that virtualization brings.
For example, in physical server environments in which files and applications run on the same piece of hardware, organizations generally wind up including whole disk drives when they back up each server, because it’s too cumbersome to run separate backup routines for different applications on a single server.
Backup utilities designed to support virtual server environments can provide organizations with a single console for slicing and dicing backup requirements for their multiple virtual machines. Those utilities make it easier to perform more frequent backups of the most active applications, such as databases, while relegating applications with more static data to less frequent schedules.
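The per-virtual-machine scheduling idea can be sketched in a few lines. This is an illustrative Python sketch only; the VM names, churn ratings, and frequency tiers are invented for the example and do not reflect any vendor's actual backup product.

```python
# Illustrative sketch: assign each virtual machine a backup frequency
# based on how volatile its data is. All names and tiers are hypothetical.

SCHEDULES = {"high": "hourly", "medium": "daily", "low": "weekly"}

vms = {
    "db01":   "high",    # active database: back up often
    "web01":  "low",     # mostly static content: weekly is enough
    "docs01": "medium",  # document store with moderate churn
}

def backup_plan(vms):
    """Map each virtual machine to a backup frequency by data churn."""
    return {name: SCHEDULES[churn] for name, churn in vms.items()}

print(backup_plan(vms))
```

Because each VM is backed up on its own schedule rather than as part of a whole-disk image, the most active workloads get frequent protection without dragging static data along with them.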
“You don’t use as much tape or disk space on the storage side for the backup because you can more easily segment what you want to back up and recover,” said Ed Harnish, vice president of marketing at data backup vendor Acronis.
At DCMA, server virtualization supported by specialized backup software has helped the agency significantly improve its backup and disaster recovery capabilities.
“It has opened up some doors for us in disaster recovery and continuity-of-operations planning that didn’t exist before,” said Peter Amstutz, chief of the Technical Requirements and Design Division at DCMA Information Technology.
Now the agency can replicate a server at an alternate location and restart it in the event of a disaster.
DCMA uses Vizioncore's esxRanger Professional in its arsenal of backup tools. The Vizioncore software is designed to work with DCMA's virtualization platform, VMware Virtual Infrastructure 3.

Physical meets virtual
Before organizations can get into the nitty-gritty of managing their virtual environments, they have to create them, and that requires calculating capacity needs.
“Now that you have a situation where you share the same physical resources among multiple virtual machines, it means a lot of organizations have to relearn the lost art of capacity planning,” said Info-Tech senior research analyst John Sloan. That means calculating whether enough headroom exists in CPU, memory, storage and network bandwidth to accommodate the multiple virtual machines that will reside on a single server.
When DCMA made a commitment to virtualization, it concentrated on specifications for CPU and memory utilization and network and disk input/output requirements for its most CPU- and memory-intensive servers. Using reports generated in Concord’s — now CA’s — eHealth network-monitoring tool, managers extracted a year’s worth of detailed performance data. They made their physical-to-virtual-server transition plans with peak workloads in mind.
“If you design on the average load, you will run into problems during spikes,” Amstutz said. DCMA then used VMware’s P2V Converter utility to move the workloads from physical to virtual machines.
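The sizing exercise Amstutz describes amounts to summing each resource's peak demand across the candidate VMs and checking it against the host's budget, with headroom to spare. The following Python sketch shows the arithmetic; the host capacities, VM figures, and 80 percent headroom threshold are illustrative assumptions, not DCMA's actual data.

```python
# Illustrative capacity-planning check: do these VMs' PEAK demands fit on
# one host with headroom? All numbers and names are hypothetical.

HOST = {"cpu_mhz": 16000, "mem_mb": 32768, "disk_iops": 5000, "net_mbps": 2000}
HEADROOM = 0.80  # never plan past 80% of any resource

vms = [
    {"name": "db01",   "cpu_mhz": 4200, "mem_mb": 8192, "disk_iops": 1800, "net_mbps": 300},
    {"name": "web01",  "cpu_mhz": 1800, "mem_mb": 4096, "disk_iops": 400,  "net_mbps": 500},
    {"name": "mail01", "cpu_mhz": 2600, "mem_mb": 6144, "disk_iops": 900,  "net_mbps": 250},
]

def fits(host, vms, headroom=HEADROOM):
    """Sum peak demand per resource and compare against the host's budget.
    Returns (True, None, None) if everything fits, else the first resource
    that overflows and its total demand."""
    for res, capacity in host.items():
        demand = sum(vm[res] for vm in vms)
        if demand > capacity * headroom:
            return False, res, demand
    return True, None, None

ok, bottleneck, demand = fits(HOST, vms)
print(ok)  # True: these three peak workloads fit within 80% of the host
```

Sizing against peak figures rather than averages is exactly the point of the exercise: a plan that fits the averages will overflow the first time the database and mail server spike together.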
Besides accounting for workload compatibility, organizations must also consider operational details such as scheduled maintenance windows for the virtual machines.
“If maintenance windows don’t overlap, you can never do hardware maintenance again” because the physical server is constantly in use, said Andrew Hillier, chief technology officer and co-founder of CiRBA, which makes software that helps organizations analyze and develop road maps for consolidated and virtualized data centers.
At the Food and Drug Administration, efforts to plan and move from a physical to a virtualized environment based on VMware ESX software have paid off. The agency has nearly 100 virtual machines running on nine physical servers. The virtual machines run a variety of database programs, e-mail, document-management applications and custom programs.
FDA used PlateSpin PowerRecon to develop a workload profile of all its servers, collecting data on CPU, memory and disk input/output.
The objective was to see how workloads add up in the ESX environment, said Michael Voss of Booz Allen Hamilton, who serves as VMware technical lead at FDA’s Office of IT Shared Services.
“As far as right-sizing, it’s been pretty on,” he said.
What has been more difficult is making sure that the administrators assigned to perform ongoing maintenance of the virtual machines don’t change the initial setup. The virtualization project, which began in FDA’s infrastructure group, is now sponsored by the Office of the Chief Information Officer.
“We virtualize something, and then the customer adds another database [for testing or development] or installs several applications,” and suddenly the resources are no longer appropriate for that machine, Voss said.
His team usually finds out about the add-ons when performance degrades, at which point it works with the agency to rematch application needs with system resources. The rebalancing can be done quickly.
“With a virtual machine, you can add disk on the fly, shut down a virtual machine, and increase CPU or memory or other resources in a matter of minutes,” Voss said.
A related benefit of virtualization is that applications don’t necessarily have to stay on a particular virtual machine on a specific hardware host.
For example, organizations might move a virtual machine running an accounting application from a two-CPU to a four-CPU server for a week’s time at the end of every month to give it more power when they’re closing the books or even move it to its own dedicated physical machine.
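That month-end move is simple calendar-driven placement logic. As a minimal Python sketch, assuming hypothetical host names and a seven-day book-closing window, it might look like this:

```python
import calendar
from datetime import date

def placement_for(day, normal_host="esx-2cpu-01", close_host="esx-4cpu-01",
                  window_days=7):
    """Return the host an accounting VM should run on: the larger host
    during the last `window_days` of the month (book closing), otherwise
    its usual home. Host names are hypothetical."""
    last_day = calendar.monthrange(day.year, day.month)[1]
    return close_host if day.day > last_day - window_days else normal_host

print(placement_for(date(2007, 6, 28)))  # esx-4cpu-01 (book-closing week)
print(placement_for(date(2007, 6, 10)))  # esx-2cpu-01 (normal operations)
```

The policy itself is trivial; what makes it actionable is that a virtual machine, unlike a physical one, can actually follow it.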
“That’s a different management problem than you ever had,” said Jim Sweeney, enterprise solutions consultant at GTSI.
With a utility like VMware's VMotion, administrators can migrate a running virtual machine from one physical server to another with little or no downtime. Third-party vendors such as Acronis and PlateSpin provide anywhere-to-anywhere workload portability, including virtual-to-physical portability and portability across different virtual operating systems.

Cookie-cutter configurations
Another perk of virtualization is the ability to create application and system templates that can be deployed quickly, reducing the time and effort involved in configuring and provisioning new servers.
“You don’t need to rebuild it, you just pull an image down,” Sweeney said. “A Web server is a template, so every time you go to the systems administrator and say you need a new Web server, it’s provisioned from that.”
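Template-based provisioning boils down to cloning a prebuilt, approved image definition and giving the copy its own identity. This Python sketch illustrates the pattern; the field names and image file are invented for the example and are not any vendor's actual schema.

```python
import copy

# Illustrative sketch of template-based provisioning: new servers are
# stamped out from a prebuilt definition instead of rebuilt by hand.
# All field names and values are hypothetical.

WEB_TEMPLATE = {
    "image": "win2003-iis-hardened.img",  # prebuilt, security-baselined image
    "cpu": 1,
    "mem_mb": 2048,
    "packages": ["iis", "webtrends-agent"],
}

def provision(template, hostname):
    """Deep-copy the template and give the new VM its own identity,
    leaving the template itself untouched for the next request."""
    vm = copy.deepcopy(template)
    vm["hostname"] = hostname
    return vm

vm = provision(WEB_TEMPLATE, "web07")
print(vm["hostname"], vm["image"])
```

Because every clone starts from the same hardened image, the security baseline Voss mentions comes along for free, and a request for "a new Web server" is a copy operation rather than a rebuild.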
Mariano Pernigotti, a senior systems engineer at engineering services firm CSSI, sees benefits in creating application templates for the SWSoft Virtuozzo virtualized server environment he oversees at the Federal Aviation Administration’s Air Traffic Control System Command Center.
Pernigotti helps support the National Operations Group, the arm of the FAA that operates the National Operations Coordination Center. The center tracks and reports National Airspace System equipment-related events.
For instance, the agency wants its virtualized Web-based application servers to make use of Webtrends software for tracking Web site usage statistics.
With Virtuozzo, every physical machine runs a single virtualized Microsoft Windows operating system. Each virtual Web server can borrow the application template that streams the Webtrends software into the virtual environment.
“As a result of not having to install Webtrends in the traditional sense on every virtual environment, the image size of each virtual machine is much smaller, so backups are much smaller…and smaller equals faster,” Pernigotti said.
FDA administrators also find value in templates. “The advantage is it’s a prebuilt virtual machine that adheres to all your security standards,” Voss said.
Some organizations, however, have reservations about the use of templates. DCMA officials, for instance, consider them an administrative burden that adds to patching and other maintenance chores, and so the agency uses them only rarely, Amstutz said.

Coming together
It might be tempting to assume that the nimbleness of virtualized servers will translate into lower administration and operations costs, but analysts caution against such expectations.
“Don’t look for savings in terms of managing the individual servers,” Sloan said.
However, management software vendors are working on products to help organizations streamline their management routines, which in turn might reduce management burdens and costs.
Opsware officials, for instance, say they can tackle data-center automation for the physical and virtual worlds, provide users with a single view into managing both environments, perform configuration management actions on either, view virtual server dependencies, and track host and guest relationships across heterogeneous virtual environments.
“What data-center automation is about is enabling you to manage all of that in one place, in a cohesive way, versus [managing] a bunch of point devices,” said Eric Vishria, vice president of marketing at Opsware.
VMware officials are working on interoperability and integration, providing a set of application program interfaces and a software development kit that lets other management vendors connect to VMware’s database and management server and pull information.
“All this is relatively new, and various vendors are in various stages of their evolution,” said Raghu Raghuram, vice president of product and solutions marketing at VMware.
Microsoft is on its own track toward building a management platform that doesn't care whether organizations have physical or virtual environments. It lets administrators manage both from a single console with tools such as Microsoft Operations Manager 2007 and the upcoming Configuration Manager.
“What is becoming very apparent to IT as they virtualize the infrastructure is that management is going to be the key piece in this whole game,” said David Greschler, director of virtualization strategy at Microsoft.
Sloan said management capabilities will be getting a lot more attention as the market for virtualization options becomes more competitive. “It will come down to not who gives you virtualization, but who gives you the most comprehensive management suite for your infrastructure, including virtual machines.”

Zaino has been covering business technology issues for the industry's leading publications since 1986. She can be reached at