A data-driven approach to data center optimization

The first step for agencies looking to consolidate: taking stock of existing assets.

Nearly eight years after the Federal Data Center Consolidation Initiative debuted, agencies are still struggling to get a handle on their on-premises systems. As Rep. Gerry Connolly (D-Va.) noted earlier this year while advocating for the Federal Information Technology Acquisition Reform Act (FITARA) Enhancement Act, the challenge is significantly more serious than originally understood: what initially appeared to be 1,100 government data centers turned out to be 11,700, which means the work required under FITARA's data center consolidation provisions will take far longer than planned.

Today, there are still many questions about how to accomplish this Herculean task.

Data center consolidation starts with planning, and planning begins by taking a detailed inventory of resources in each environment to be consolidated. While this sounds straightforward, collecting information about IT assets can be quite challenging for several reasons.

First, when data sources that have operated independently are brought together, the combined inventory typically contains redundant or duplicated records. Removing these duplicates reduces the total amount of data that ultimately needs to be consolidated.
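As a rough illustration, the deduplication step can be as simple as keying records on a couple of identifying fields and keeping one record per key. The Python sketch below assumes illustrative field names (serial, hostname, source); real inventory tools will have their own schemas.

    # Minimal sketch: de-duplicate asset records pulled from independent inventory sources.
    # Field names (serial, hostname, source) are illustrative, not tied to any specific tool.

    def dedupe_assets(records):
        """Keep one record per (serial, hostname) pair, preferring the first record seen."""
        seen = {}
        for rec in records:
            key = (rec.get("serial", "").strip().upper(),
                   rec.get("hostname", "").strip().lower())
            if key not in seen:
                seen[key] = rec
        return list(seen.values())

    merged = [
        {"serial": "SN-001", "hostname": "app01", "source": "cmdb_a"},
        {"serial": "sn-001", "hostname": "APP01", "source": "cmdb_b"},  # same asset, different casing
        {"serial": "SN-002", "hostname": "db01",  "source": "cmdb_a"},
    ]
    print(dedupe_assets(merged))  # two unique assets remain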

Next, independent data sources collect all kinds of asset information, including data that is irrelevant to asset inventory analysis. Service and support notices are examples of asset data that should be removed.
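A similar pass can drop record types that do not belong in an asset inventory at all. The record-type labels below are hypothetical stand-ins for whatever a given source system actually emits.

    # Illustrative filter: discard record types irrelevant to the inventory,
    # such as service and support notices. The type labels are hypothetical.
    IRRELEVANT_TYPES = {"service_notice", "support_bulletin", "maintenance_reminder"}

    def keep_inventory_records(records):
        return [r for r in records if r.get("record_type") not in IRRELEVANT_TYPES]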

Further, it is quite common for the name of the same company or product to be entered into organizational systems in multiple ways (e.g., ABC Corporation, ABC Corp., etc.). Normalization is the process of rationalizing all the potential names and abbreviations of a product or organization into a single standard form.
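A basic way to do this is an alias table that maps every observed spelling onto one canonical name. The sketch below is purely illustrative; the aliases and the canonical form "ABC Corp." are made-up examples, and commercial normalization catalogs handle this at far larger scale.

    # Sketch of name normalization: map the many ways a vendor appears across systems
    # onto a single canonical form. The alias table is illustrative only.
    import re

    VENDOR_ALIASES = {
        "abc corporation": "ABC Corp.",
        "abc corp": "ABC Corp.",
        "abc corp.": "ABC Corp.",
    }

    def normalize_vendor(raw_name):
        key = re.sub(r"\s+", " ", raw_name.strip().lower())
        return VENDOR_ALIASES.get(key, raw_name.strip())

    print(normalize_vendor("ABC   Corporation"))  # -> "ABC Corp."
    print(normalize_vendor("abc corp."))          # -> "ABC Corp."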

Ever-changing external market information is another challenge not to be overlooked. Examples include information on product compatibility, product end-of-life (EOL), product end-of-support (EOS), hardware specifications and software vulnerabilities.
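To show how such external data might be folded in, the sketch below flags assets whose software has passed a known end-of-support date. The EOL_DATES table and product names are hypothetical placeholders for whatever market-data feed an agency subscribes to.

    # Sketch: flag assets whose software is past its end-of-life/end-of-support date.
    from datetime import date

    EOL_DATES = {  # hypothetical product -> end-of-support date
        "ExampleOS 12": date(2020, 6, 30),
        "ExampleDB 9":  date(2026, 1, 15),
    }

    def flag_eol(assets, today=None):
        today = today or date.today()
        flagged = []
        for asset in assets:
            eos = EOL_DATES.get(asset["product"])
            if eos and eos < today:
                flagged.append({**asset, "eos_date": eos})
        return flagged

    inventory = [{"hostname": "app01", "product": "ExampleOS 12"},
                 {"hostname": "db01",  "product": "ExampleDB 9"}]
    print(flag_eol(inventory))  # only the out-of-support operating system is flagged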

Manual vs. automated collection

Ask IT staffers who have manually performed asset deduplication and normalization about their experiences, and they will tell you these projects are extremely time-consuming. The manual effort required to collect, clean and normalize all hardware and software asset information can reach nearly 9,000 hours in a typical project. At 40 hours per week, that equates to 225 weeks, or more than four years of staff time.

Execution

Armed with comprehensive asset information, IT teams can consolidate applications with confidence. Cost savings come from consolidating or eliminating redundant systems and from pooling software licenses to strengthen purchase negotiations.

Here are four key areas where accurate, comprehensive asset information benefits the data center consolidation process:

  1. Software License Compliance
    Vendors can audit your organization at any time for software license compliance, and everyone understands the penalties for using unlicensed software. On the plus side, identifying identical software purchases puts the IT team in a better position to combine them into a single contract, with the potential for improved pricing (see the roll-up sketch after this list).
  2. Vulnerabilities
    EOL/EOS software is a hacker target and poses a major security risk. During consolidation, IT team members will want to review the risks posed by highly vulnerable software. Asset information that includes software vulnerability statistics is a useful way to gauge this risk.
  3. Software and Hardware Compatibility
    Data center consolidation provides an opportunity to re-evaluate things that might otherwise remain business as usual. Given the potentially large number of applications to support, which applications require upgrades? If a server upgrade is necessary, are the applications compatible? Could one or more servers be candidates for migration to a virtual hosted server?
  4. Hardware Real Estate Planning
    A big challenge of consolidating data centers is understanding the real estate ramifications of the move. Planning the floor space of a new data center is a daunting task in its own right, and the mix of hardware vendors and devices adds to the challenge regardless of how much space the new facility offers. Consolidated asset information eliminates the staff time otherwise wasted gathering and researching those details.
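To make the license-pooling point from item 1 concrete, here is a small sketch that rolls up seat counts per normalized product across contracts, making it easier to spot candidates for a single consolidated purchase. The field names and figures are illustrative.

    # Sketch: total license seats per normalized product across existing contracts.
    from collections import defaultdict

    def license_rollup(license_records):
        totals = defaultdict(int)
        for rec in license_records:
            totals[rec["product"]] += rec["seats"]
        return dict(totals)

    purchases = [
        {"product": "ABC Office Suite", "seats": 250, "contract": "C-101"},
        {"product": "ABC Office Suite", "seats": 400, "contract": "C-202"},
        {"product": "XYZ Backup",       "seats": 75,  "contract": "C-303"},
    ]
    print(license_rollup(purchases))  # {'ABC Office Suite': 650, 'XYZ Backup': 75}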

Data center consolidation begins with taking stock of the assets to be consolidated. By building this comprehensive view of their assets, and refreshing it continually throughout the consolidation process and beyond, IT teams set themselves up for long-term benefits: duplicated information is removed, data is normalized, potential security vulnerabilities are identified and licenses are pooled for stronger negotiating.