The new diet plan: Virtualization, then major consolidation

Most organizations will require new tools and techniques to reap the full benefits.

One of the largest virtualization deployments in the government is at the Defense Contract Management Agency, which has nearly 650 virtual machines residing on about 60 physical servers. DCMA officials credit virtualization with helping them consolidate 18 data centers into five and realize a host of other benefits.

Virtualization software divides a physical server into isolated virtual environments, enabling organizations to run multiple applications or operating systems on a single server. Such consolidation helps agencies reduce the number of physical servers they must maintain and creates opportunities to streamline many administrative tasks.

But as most agencies soon learn, the flexibility and efficiency that virtualization brings also change the server management paradigm. Virtualization requires new tools to support the new infrastructures, new processes to govern them and new employee skills.

“This is one of the biggest traps that enterprises fall into — thinking their tools to manage physical systems are good enough to manage virtual systems,” said Andi Mann, senior analyst at Enterprise Management Associates. “In most cases, that’s not true.”

Companies offer an increasing number of tools for managing virtual environments, and a few provide hybrid tools that manage limited aspects of both physical and virtual environments, Mann said. However, no one offers a single toolset that organizations can use to support all their management disciplines across physical and virtual infrastructures. And even if such tools were available, virtualization experts say, organizations that used only one set of techniques to manage both environments could miss many of the benefits that virtualization brings.

For example, in physical server environments in which files and applications run on the same piece of hardware, organizations generally wind up including whole disk drives when they back up each server because it’s too cumbersome to run separate backup routines for different applications on a single server.

Backup utilities designed to support virtual server environments can give organizations a single console for slicing and dicing backup requirements across their multiple virtual machines. Those utilities make it easier to perform more frequent backups of the most active applications, such as databases, while relegating applications with more static data to less frequent schedules. “You don’t use as much tape or disk space on the storage side for the backup because you can more easily segment what you want to back up and recover,” said Ed Harnish, vice president of marketing at data backup vendor Acronis.

At DCMA, server virtualization supported by specialized backup software has helped the agency significantly improve its backup and disaster recovery capabilities. “It has opened up some doors for us in disaster recovery and continuity-of-operations planning that didn’t exist before,” said Peter Amstutz, chief of the Technical Requirements and Design Division at DCMA Information Technology. Now the agency can replicate a server at an alternate location and restart it in the event of a disaster. DCMA uses Vizioncore’s esxRanger Professional in its arsenal of backup tools. The Vizioncore software is designed to work with DCMA’s virtualization platform, VMware Infrastructure 3.
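
In code terms, that segmentation amounts to giving each virtual machine its own backup interval based on how quickly its data changes. The minimal Python sketch below illustrates the idea; the machine names, thresholds and intervals are invented for illustration and have no connection to Acronis’ or Vizioncore’s actual products.

# Hypothetical sketch: per-VM backup schedules instead of whole-disk
# backups of a shared server. All names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class VirtualMachine:
    name: str
    daily_change_gb: float  # how much of the VM's data churns per day

def backup_interval_hours(vm: VirtualMachine) -> int:
    """More volatile VMs, such as databases, get more frequent backups."""
    if vm.daily_change_gb > 10:
        return 4    # active database: every four hours
    if vm.daily_change_gb > 1:
        return 24   # ordinary application server: daily
    return 168      # mostly static file or Web server: weekly

vms = [VirtualMachine("erp-db", 40.0), VirtualMachine("intranet-web", 0.2)]
for vm in vms:
    print(f"{vm.name}: back up every {backup_interval_hours(vm)} hours")

Because each virtual machine is backed up on its own schedule, static servers no longer ride along on the database’s aggressive rotation, which is where the tape and disk savings Harnish describes come from.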

Physical meets virtual

Before organizations can get into the nitty-gritty of managing their virtual environments, they have to create them, and that requires calculating capacity needs.

“Now that you have a situation where you share the same physical resources among multiple virtual machines, it means a lot of organizations have to relearn the lost art of capacity planning,” said Info-Tech senior research analyst John Sloan. That means calculating whether enough headroom exists in CPU, memory, storage and network bandwidth to accommodate the multiple virtual machines that will reside on a single server.

When DCMA made a commitment to virtualization, it concentrated on specifications for CPU and memory utilization and network and disk input/output requirements for its most CPU- and memory-intensive servers. Using reports generated by Concord’s — now CA’s — eHealth network-monitoring tool, managers extracted a year’s worth of detailed performance data. They made their physical-to-virtual transition plans with peak workloads in mind. “If you design on the average load, you will run into problems during spikes,” Amstutz said. DCMA then used VMware’s P2V Converter utility to move the workloads from physical to virtual machines.

Besides accounting for workload compatibility, organizations must also consider operational details such as scheduled maintenance windows for the virtual machines. “If maintenance windows don’t overlap, you can never do hardware maintenance again” because the physical server is constantly in use, said Andrew Hillier, chief technology officer and co-founder of CiRBA, which makes software that helps organizations analyze and develop road maps for consolidated and virtualized data centers.
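
DCMA’s peak-based sizing and Hillier’s maintenance-window caveat both reduce to simple checks. The Python sketch below assumes made-up host capacities, per-VM peak figures and a 20 percent headroom margin; none of the numbers come from the agencies quoted here.

# Hypothetical capacity check: do these VMs' PEAK workloads fit on one
# host with room to spare? All figures are illustrative.

HEADROOM = 0.20  # keep 20 percent of every resource free (assumed margin)

host = {"cpu_ghz": 16.0, "mem_gb": 32.0, "disk_iops": 5000, "net_mbps": 1000}

# Peak, not average, demand per VM -- designing on the average load
# invites trouble during spikes.
vm_peaks = [
    {"cpu_ghz": 4.0, "mem_gb": 8.0, "disk_iops": 1200, "net_mbps": 200},
    {"cpu_ghz": 6.0, "mem_gb": 12.0, "disk_iops": 1800, "net_mbps": 300},
]

def fits(host, vm_peaks, headroom=HEADROOM):
    """Return (True, None) if every summed peak stays under capacity."""
    for resource, capacity in host.items():
        demand = sum(vm[resource] for vm in vm_peaks)
        if demand > capacity * (1 - headroom):
            return False, resource
    return True, None

ok, bottleneck = fits(host, vm_peaks)
print("fits" if ok else f"over capacity on {bottleneck}")

# Hillier's point: the co-located VMs' maintenance windows must share
# some overlap, or the host itself can never be taken down.
windows = [(1, 5), (2, 6), (3, 7)]  # (start_hour, end_hour) per VM
start, end = max(w[0] for w in windows), min(w[1] for w in windows)
print("shared window:", (start, end) if start < end else "none -- conflict")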

At the Food and Drug Administration, efforts to plan and move from a physical to a virtualized environment based on VMware ESX software have paid off. The agency has nearly 100 virtual machines running on nine physical servers. The virtual machines run a variety of database programs, e-mail, document-management applications and custom programs. The virtualization project, which began in FDA’s infrastructure group, is now sponsored by the Office of the Chief Information Officer.

FDA used PlateSpin PowerRecon to develop a workload profile of all its servers, collecting data on CPU, memory and disk input/output. The objective was to see how workloads add up in the ESX environment, said Michael Voss of Booz Allen Hamilton, who serves as VMware technical lead at FDA’s Office of IT Shared Services. “As far as right-sizing, it’s been pretty on,” he said.

What has been more difficult is making sure that the administrators assigned to perform ongoing maintenance of the virtual machines don’t change the initial setup. “We virtualize something, and then the customer adds another database [for testing or development] or installs several applications,” and suddenly the resources are no longer appropriate for that machine, Voss said. His team usually finds out about the add-ons when performance degrades, at which point it works with the agency to rematch application needs with system resources.

The rebalancing can be done quickly. “With a virtual machine, you can add disk on the fly, shut down a virtual machine, and increase CPU or memory or other resources in a matter of minutes,” Voss said.

A related benefit of virtualization is that applications don’t necessarily have to stay on a particular virtual machine on a specific hardware host. For example, organizations might move a virtual machine running an accounting application from a two-CPU to a four-CPU server for a week at the end of every month to give it more power when they’re closing the books, or even move it to its own dedicated physical machine. “That’s a different management problem than you ever had,” said Jim Sweeney, enterprise solutions consultant at GTSI.

With a utility such as VMware’s VMotion, administrators can move a virtual machine almost instantaneously from one physical server to another. Third-party vendors such as Acronis and PlateSpin provide anywhere-to-anywhere workload portability, including virtual-to-physical portability and portability across different virtual operating systems.

Cookie-cutter configurations

Another perk of virtualization is the ability to create application and system templates that can be deployed quickly, reducing the time and effort involved in configuring and provisioning new servers. “You don’t need to rebuild it, you just pull an image down,” Sweeney said. “A Web server is a template, so every time you go to the systems administrator and say you need a new Web server, it’s provisioned from that.”

Mariano Pernigotti, a senior systems engineer at engineering services firm CSSI, sees benefits in creating application templates for the SWsoft Virtuozzo virtualized server environment he oversees at the Federal Aviation Administration’s Air Traffic Control System Command Center. Pernigotti helps support the National Operations Group, the arm of the FAA that operates the National Operations Coordination Center, which tracks and reports equipment-related events in the National Airspace System.

For instance, the agency wants its virtualized Web-based application servers to use Webtrends software for tracking Web site usage statistics. With Virtuozzo, every physical machine runs a single virtualized Microsoft Windows operating system, and each virtual Web server can borrow the application template that streams the Webtrends software into the virtual environment. “As a result of not having to install Webtrends in the traditional sense on every virtual environment, the image size of each virtual machine is much smaller, so backups are much smaller…and smaller equals faster,” Pernigotti said.

FDA administrators also find value in templates. “The advantage is it’s a prebuilt virtual machine that adheres to all your security standards,” Voss said.

Some organizations, however, have reservations about templates. DCMA officials, for instance, consider them an administrative burden that adds to patching and other maintenance chores, so the agency uses them only rarely, Amstutz said.
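
Conceptually, template provisioning is a clone-and-customize operation: copy a prebuilt, security-hardened image and apply per-instance settings instead of installing an operating system and applications from scratch. The Python sketch below uses a plain directory copy and hypothetical file paths purely for illustration; VMware, Virtuozzo and similar platforms do this with their own tooling.

# Hypothetical template provisioning: clone a hardened "golden" image
# and customize it, rather than rebuilding each new server by hand.
# Paths and the hostname step are illustrative placeholders.

import shutil
from pathlib import Path

TEMPLATE_DIR = Path("/vm-templates/web-server")  # assumed golden image

def provision_from_template(new_vm_name: str, dest_root: Path) -> Path:
    dest = dest_root / new_vm_name
    shutil.copytree(TEMPLATE_DIR, dest)          # "pull an image down"
    # Apply per-instance settings after the copy (hostname, addresses).
    (dest / "hostname.conf").write_text(new_vm_name + "\n")
    return dest

# Every request for a new Web server becomes a clone, not a rebuild:
# provision_from_template("web-07", Path("/vm-instances"))

The trade-off DCMA raises shows up here, too: every template is one more image that must be patched and maintained alongside the running machines built from it.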

Coming together

It might be tempting to assume that the nimbleness of virtualized servers will translate into lower administration and operations costs, but analysts caution against such expectations. “Don’t look for savings in terms of managing the individual servers,” Sloan said.

However, software management vendors are working on products to help organizations streamline their management routines, which in turn might reduce management burdens and costs. Opsware officials, for instance, say they can tackle data-center automation for the physical and virtual worlds, provide users with a single view into managing both environments, perform configuration management actions on either, view virtual server dependencies, and track host and guest relationships across heterogeneous virtual environments. “What data-center automation is about is enabling you to manage all of that in one place, in a cohesive way, versus [managing] a bunch of point devices,” said Eric Vishria, vice president of marketing at Opsware.

VMware officials are working on interoperability and integration, providing a set of application programming interfaces and a software development kit that let other management vendors connect to VMware’s database and management server and pull information. “All this is relatively new, and various vendors are in various stages of their evolution,” said Raghu Raghuram, vice president of product and solutions marketing at VMware.

Microsoft is on its own track, building a management platform that doesn’t care whether organizations have physical or virtual environments. It lets administrators manage both from a single console with tools such as System Center Operations Manager 2007 and the upcoming System Center Configuration Manager. “What is becoming very apparent to IT as they virtualize the infrastructure is that management is going to be the key piece in this whole game,” said David Greschler, director of virtualization strategy at Microsoft.

Sloan said management capabilities will get much more attention as the market for virtualization options becomes more competitive. “It will come down to not who gives you virtualization, but who gives you the most comprehensive management suite for your infrastructure, including virtual machines.”
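
Underneath any such unified console sits a simple idea: an inventory that records which guest runs on which physical host and stays current as workloads move. The toy Python illustration below uses invented machine names and is not modeled on any vendor’s product.

# Toy inventory mapping guests to physical hosts -- the host/guest
# relationship a unified management view has to keep current.

inventory = {  # guest name -> current physical host
    "erp-db": "rack1-srv03",
    "intranet-web": "rack1-srv03",
    "mail-01": "rack2-srv01",
}

def migrate(guest, new_host):
    """Record a migration so the management view stays accurate."""
    old = inventory[guest]
    inventory[guest] = new_host
    print(f"{guest}: {old} -> {new_host}")

def guests_on(host):
    """List every guest currently placed on a physical host."""
    return [g for g, h in inventory.items() if h == host]

migrate("erp-db", "rack2-srv01")  # e.g., a month-end move to a bigger box
print(guests_on("rack2-srv01"))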
Zaino has been covering business technology issues for the industry’s leading publications since 1986. She can be reached at jennyzaino@hotmail.com.

Good people and processes are virtual gold

Things move fast in virtualized environments. For many organizations, that means they must adjust processes for routine tasks, such as configuring and patching systems, to reflect that dynamism. Similarly, they must retool their employees’ skills to support virtualization.

“This is going to drive a lot of process re-engineering business, and drive process adoption and standardization,” said Stephen Elliot, a research manager at IDC.

Elliot said virtualization platform vendors will begin to develop more structured management techniques to ensure that tasks are standardized and produce an audit trail.

“There’s not as much visibility as one might think into the virtual machines from a change, configuration and compliance reporting standpoint,” Elliot said. The granularity that auditors are strict about — the type of information they need for tracking and reporting — is often not there, so custom coding is essential, he said.
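
The custom coding Elliot describes often amounts to capturing who changed what on which virtual machine, and when, in an append-only record that auditors can query. A minimal Python sketch follows; the field names are assumptions, not any compliance product’s schema.

# Minimal sketch of a custom change-audit record for virtual machines.
# Field names are illustrative, not a real compliance product's schema.

import json
from datetime import datetime, timezone

def record_change(log_path, vm, setting, old, new, admin):
    """Append one change event to a simple JSON-lines audit trail."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "vm": vm,
        "setting": setting,
        "old_value": old,
        "new_value": new,
        "changed_by": admin,
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")

record_change("vm-changes.log", "erp-db", "memory_gb", 8, 16, "jsmith")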

Officials at management software vendor BMC Software said Information Technology Infrastructure Library (ITIL) best practices have an important place in the emerging world of virtualization. ITIL provides guidance for creating standard and efficient IT system management practices.

“A high percentage of outages are caused by changes, so you need good change management processes in place so you can figure out what you did and remediate it quickly,” said Chris Aherne, managing director of federal sales and operations at BMC. The company makes the BSM service management suite and other tools, including Performance Assurance for planning virtualization deployments.

Organizations must also consider other processes. The Food and Drug Administration, for instance, is refining its chargeback model as internal agency customers upgrade their initial requests for resources, said Lowell Marshall, virtualization project manager at FDA’s Office of IT Shared Services. “With virtualization, there are lots of different scenarios that you have to keep abreast of,” he said.
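
A chargeback model of the kind FDA is refining typically bills each internal customer for the resources allocated to its virtual machines, so an upgraded request changes the invoice. A simplified Python sketch with invented rates:

# Simplified chargeback sketch: bill internal customers monthly for
# resources allocated to their virtual machines. Rates are invented.

RATES = {"cpu": 25.00, "mem_gb": 10.00, "disk_gb": 0.50}  # $ per unit-month

def monthly_charge(allocation):
    """Sum the allocated quantities times their per-unit monthly rates."""
    return sum(RATES[res] * qty for res, qty in allocation.items())

# A customer upgrades from 2 CPUs/8 GB to 4 CPUs/16 GB midproject:
before = {"cpu": 2, "mem_gb": 8, "disk_gb": 100}
after = {"cpu": 4, "mem_gb": 16, "disk_gb": 100}
print(f"${monthly_charge(before):.2f} -> ${monthly_charge(after):.2f}")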

Virtualization will also require new employee skills. A recent survey by Enterprise Management Associates found that 17 percent of organizations that have deployed virtualization technology believe their employees don’t have sufficient skills, and 35 percent think they might have sufficient skills.

Mariano Pernigotti, a senior systems engineer at CSSI who works as a contractor at the Federal Aviation Administration’s Air Traffic Control System Command Center, said people must be focused when they manage a virtual environment. Server administrators, distracted by ringing phones or urgent e-mail requests, could accidentally make changes on the wrong virtual machine if they are not careful.

“We need sharp people,” Pernigotti said.

— Jennifer Zaino
Tips for managing virtual environments

Want to get a head start on deploying and managing your virtual environment? Here is some advice from people who have started down the virtualization path.

Begin with new hardware
The Defense Contract Management Agency created its virtual environment on a single hardware platform for consistent and repeatable installations. Dual-core Opteron processors from Advanced Micro Devices give the agency the most power for the dollar, said Peter Amstutz, chief of the Technical Requirements and Design Division at DCMA Information Technology.

If you haven’t already deployed a shared-storage architecture, do it now
“A [storage-area network] is the way that we are going to make things like migration, cloning and backups even faster, and it will allow us to scale a little bit more,” said Mariano Pernigotti, a senior systems engineer at CSSI who works as a contractor at the Federal Aviation Administration’s Air Traffic Control System Command Center. SAN storage also provides a basis for more granular management and better security, he added. 

Enhance staff skills to encompass storage, networking and server technology
“You need broader skills because setting up VMware is setting up an infrastructure,” said Michael Voss of Booz Allen Hamilton, who serves as VMware technical lead at the Food and Drug Administration’s Office of IT Shared Services.

Make fault tolerance rather than backup and recovery the objective
That’s a goal for Grant Schneider, deputy director of the Information Management and Chief Information Officer Directorate at the Defense Intelligence Agency. DIA plans to virtualize its servers while it minimizes the number of data centers it maintains worldwide. As the agency moves down its virtualization path, it must set up applications, servers, storage and the rest of the environment to automatically fail over, Schneider said. “We can’t build things that don’t break, but when they do, we want [problems to be] transparent to our customers,” he said. That objective is more achievable in a virtual environment than a physical one, both in terms of speed and costs, he added.
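
At its simplest, the automatic failover Schneider describes means detecting that a primary instance has stopped responding and bringing the workload up at an alternate site. The bare-bones Python sketch below uses hypothetical host names and a plain TCP health check; a real platform runs this logic continuously and restarts the virtual machine replica itself.

# Bare-bones failover sketch: if the primary site stops answering a
# health check, direct work to the alternate site. Host names and the
# TCP check are hypothetical stand-ins for a real platform's monitoring.

import socket

SITES = ["primary.dc.example", "alternate.dc.example"]

def is_healthy(host, port=443, timeout=2.0):
    """Treat a successful TCP connection as a passing health check."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def active_site():
    """Return the first site that passes its health check."""
    for host in SITES:
        if is_healthy(host):
            return host
    raise RuntimeError("no site available")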

Stick to the list
There are approved servers and storage systems for virtualization, and agencies should use them, said Jim Sweeney, enterprise solutions consultant at GTSI. “If it’s not on that list, we’re not going to implement it,” he said. “This is complex stuff.”

— Jennifer Zaino