Consolidation management presents key IRM challenge

The Office of Management and Budget's Bulletin No. 96-02, which instructs agencies to begin developing plans to close and consolidate small and midsize data centers, will drive a requirement for data and application audits on a scale never before imagined.

The OMB initiative will force federal information resources managers to re-evaluate existing systems and processing models to increase productivity, and to administer and integrate new resources that may be brought under their purview. The program is fueling a new drive toward data and application management aimed at extracting greater performance from installed resources and delivering increased application reliability and availability.

The overriding direction of the OMB bulletin is toward resource and facility consolidation, a development that promises to create a series of super IRM operations. Highlights of the bulletin include:

• A requirement that data centers provide a baseline processing capacity of at least 325 million instructions per second or face consolidation.

• A strong recommendation for data processing outsourcing, either to larger federal centers or to industry.

• The introduction of fee-for-service operating models.

This trend is spawning a new class of application management tools. These tools include intelligent agents that reside on top of applications, monitoring and managing those applications and feeding back management data to a central console in real time.
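
As a concrete illustration, here is a minimal sketch of that agent-to-console pattern, assuming a hypothetical console address, metric names and polling interval rather than any particular vendor's product: the agent periodically gathers a few health indicators from the managed application and pushes each reading to the central console.

```python
# Minimal sketch of the agent-to-console pattern described above.
# The console address, metric names and probe values are hypothetical.
import json
import socket
import time

CONSOLE_ADDR = ("127.0.0.1", 9000)  # hypothetical central management console


def collect_status(app_name):
    """Gather a few health indicators from the managed application."""
    return {
        "app": app_name,
        "timestamp": time.time(),
        "available": True,      # e.g. the result of a connection probe
        "response_ms": 42,      # e.g. the timing of a canned query
        "sessions": 117,        # e.g. a count of active user sessions
    }


def run_agent(app_name, interval_secs=30):
    """Poll the application and push each reading to the console over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        reading = collect_status(app_name)
        sock.sendto(json.dumps(reading).encode("utf-8"), CONSOLE_ADDR)
        time.sleep(interval_secs)


if __name__ == "__main__":
    run_agent("payroll-db")
```

A production agent would also act on thresholds locally, but the essential loop of observe, report and repeat is the same.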

These commercial off-the-shelf technologies dramatically extend the reach of the database administrator—a key consideration when organizations are working to do more with less. Furthermore, by making this expertise available off the shelf, application management tools give organizations the opportunity to truly manage all installed resources.

Ironically, while the data center consolidation effort is focused on bringing components together, it will almost inevitably drive increased geographic dispersion of managed data and application resources. In many circumstances, resources will not be moved at all; they will be managed from afar, making remote resource management an essential capability. Consolidation is also requiring ever-tighter data management to make the most of expensive bandwidth.

Within this dynamic environment, IRMs need new capabilities to remotely manage new applications and data structures introduced by consolidation. Data centers that have been built around DB2 and IMS as central repositories for mission-critical data will inherit instances of Oracle, Sybase, Informix and Ingres as well as Lotus Notes. Solutions based on MVS and VMS will be fused with Unix, NT, OS/2 and NetWare. Deployed network management solutions such as Hewlett-Packard Co.'s OpenView, Digital Equipment Corp.'s PolyCenter, Sun Microsystems Inc.'s NetManager, Tivoli Systems Inc.'s Tivoli and Computer Associates International Inc.'s UniCenter will need to be integrated if management is to receive a true picture of the health of all fielded resources.
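
To show what such an integrated view might look like, here is a minimal sketch that folds status readings from several heterogeneous resources into a single consolidated picture; the resource names and per-source check functions are illustrative placeholders, not vendor APIs.

```python
# Minimal sketch: fold per-resource health readings into one consolidated view.
# The resource names, states and check functions are illustrative placeholders.

def check_db2():
    # Would wrap a native DB2/MVS health probe.
    return {"resource": "DB2 (MVS)", "state": "up", "detail": "ok"}

def check_oracle():
    # Would wrap a native Oracle health probe on a Unix host.
    return {"resource": "Oracle (Unix)", "state": "degraded", "detail": "slow I/O"}

def check_notes():
    # Would wrap a Lotus Notes server probe on NT.
    return {"resource": "Lotus Notes (NT)", "state": "up", "detail": "ok"}

CHECKS = [check_db2, check_oracle, check_notes]

def consolidated_picture():
    """Collect every resource's status and derive a single overall health flag."""
    readings = [check() for check in CHECKS]
    overall = "healthy" if all(r["state"] == "up" for r in readings) else "attention needed"
    return {"overall": overall, "resources": readings}

if __name__ == "__main__":
    picture = consolidated_picture()
    print("Overall:", picture["overall"])
    for r in picture["resources"]:
        print(f'{r["resource"]:<18} {r["state"]:<10} {r["detail"]}')
```

In practice each check would wrap the native monitoring interface of the database or platform in question, whether queried directly or through one of the network management frameworks named above.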

The OMB bulletin recommends evaluating the viability of smaller agencies' data centers, proposing "outsourcing" of the in-house function to larger federal data centers or commercial organizations where appropriate. Any discussion of outsourcing, in turn, necessitates "fee for service" models for the surviving data centers. All these factors point to a more competitive operating model for federal data center managers.

In short, the OMB bulletin makes it clear that federal data centers must deliver the required services, the required application and data availability, and the required performance if they are to survive.

Migrating to Client/Server

While still unfolding on the mission-critical scale, the move to distributed processing models offers a wealth of efficiency enhancements. However, many IRMs have shunned client/server migration because of the risks of relying on less structured environments. In mission-critical environments, 24-hours-a-day, seven-days-a-week data reliability is an absolute requirement. A performance enhancement or cost saving is worth pursuing only if you are sure it will not undermine the mission-critical function it supports.

In addition to the instabilities that have characterized client/server Unix and NT environments, serious data and application managers have been put off by the practical difficulties of inserting new open-systems technologies into legacy mainframe MVS and minicomputer VAX/VMS shops. While profiles and interfaces exist to facilitate connectivity, the challenges of managing data across scalable environments have been prohibitive.

Application and data management technologies exist that can provide the "control infrastructure" to allow organizations to move mission-critical applications off proprietary mainframe platforms without risking IRM careers. We are not talking about the death of the mainframe. On the contrary, we are talking about introducing new distributed Unix, NT and OS/2 components to existing centralized environments.

Application and data management tools provide a pragmatic counterpoint to the management chaos created by open systems. As standards have enhanced interoperability, the complexity of the management challenge has grown as well. The expertise of application and database administrators who once focused on a single area must now be leveraged throughout the organization. Application and data management software provides the control infrastructure to proactively manage complexity and optimize diversity.

The walls within and among federal agencies are being torn down in search of increased data processing efficiencies. The winners in this new Darwinian environment will be those professionals who proactively leverage their skill sets to deliver the most efficient and reliable processing resources. That means client/server; it means managing remote resources; it means maximizing system availability and dependability; and it means new integrated, scalable toolkits will play an increasingly critical role in addressing these challenges.


With over 20 years' experience in the federal IT community, Cullen is vice president and general manager of federal operations at BMC Software Inc. She can be reached at kitty_cullen@bmc.com.
