CIO Perspective

IT infrastructure: The role of the cloud

Richard Spires

In my previous columns on IT infrastructure, I discussed the importance of moving to a modern, standardized and consolidated IT infrastructure, at least at an agency-by-agency level. Such infrastructure rationalization is foundational for enabling IT to be most effective in helping government mission or business customers.

In my columns, I presented four structural obstacles that have greatly inhibited the federal government’s ability to make significant progress on IT infrastructure rationalization, along with six approaches government can take to surmount those obstacles.

Before moving on to other topics, let’s address what I mean by a modern, standardized and consolidated IT infrastructure.

There are a number of viable models that can work for an agency, depending on the size and complexity of its IT systems. Perhaps ironically, I find it concerning that an agency would rely on only one vendor to provide its IT infrastructure capabilities, even though one would think that is the very definition of consolidated.

Instead, my experience shows that government is better served when there is regular competition for such services or at least the real threat of such competition. Too often, agencies get locked into long-term contracts that do not provide mechanisms for ensuring that an agency is keeping pace with ongoing improvements in technology. As a result, the agency falls further behind, making it exceedingly difficult to introduce new capabilities and reduce overall operating costs.

Agencies must take advantage of an approach that aligns with the mature commercial business models that serve large private-sector firms. A key component of an agency’s IT infrastructure strategy should be leveraging cloud computing capabilities -- private clouds hosted at government data centers or at a vendor’s facility, together with public cloud services.

The business model is compelling because it lowers overall capital costs and moves agencies to a consumption-based model. What is equally compelling is that cloud services are easy to benchmark in terms of service quality and cost, which enables agencies to measure whether they are getting service at competitive prices. That helps ensure a fair deal today, and as services evolve, agencies can continue to benchmark offerings to ensure that their cloud service providers are staying competitive.
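The benchmarking discipline described above comes down to simple unit-cost arithmetic: price out equivalent capacity under a consumption-based model and under an ownership model, then compare providers. The sketch below illustrates the idea; every rate, capacity figure, and provider name is hypothetical and not drawn from the column.

```python
# Illustrative sketch: benchmarking consumption-based cloud pricing
# against a capital-expense model. All figures are hypothetical.

def consumption_cost(rate_per_server_hour, servers, hours):
    """Cost under a pay-per-use cloud model."""
    return rate_per_server_hour * servers * hours

def capex_cost(hardware, annual_opex, years):
    """Total cost of owning equivalent capacity outright."""
    return hardware + annual_opex * years

# Hypothetical workload: 40 virtual servers, running year-round for 3 years.
servers, hours_per_year, years = 40, 24 * 365, 3

provider_a = consumption_cost(0.10, servers, hours_per_year) * years  # $0.10/server-hour
provider_b = consumption_cost(0.12, servers, hours_per_year) * years  # $0.12/server-hour
owned = capex_cost(hardware=250_000, annual_opex=60_000, years=years)

for label, cost in [("Provider A", provider_a),
                    ("Provider B", provider_b),
                    ("Owned data center", owned)]:
    print(f"{label}: ${cost:,.0f} over {years} years")
```

Because the units (dollars per server-hour) are common across providers, an agency can rerun the same comparison as prices and offerings evolve, which is the ongoing benchmarking the column recommends.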

So what might a modern, standard and consolidated infrastructure look like at a government agency?

In most cases, agencies will need an ongoing brick-and-mortar data center. Depending on its size and complexity, an agency might need multiple data centers to provide high-availability and disaster recovery capabilities. That might justify a small number of physical data centers but not the dozens or even hundreds that still exist at some of the large agencies. Some legacy applications cannot easily live in cloud architectures, and applications that house highly sensitive and even classified systems will need the physical security controls of a dedicated data center. However, agencies should have plans to modernize legacy systems to at least enable them to move to a highly virtualized environment.

Furthermore, agencies should be migrating most of their applications from stand-alone servers dedicated to individual systems to cloud services that offer production, development and test-as-a-service models. Enterprise commodity applications should be migrated to software-as-a-service models, with applications like email and SharePoint leading the way. Some agencies are now getting more creative and looking at other enterprise SaaS offerings, including business intelligence and customer relationship management. Cloud models are even being used for virtual desktops and mobile device management, enabling agencies to move away from buying and directly managing end-user devices.

A continuing key concern about cloud computing is security. Agencies should deploy a set of private cloud services in their brick-and-mortar data centers for applications and infrastructure that handle sensitive data. Applications that use non-sensitive data can rely on public cloud services.
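The tiering described above (public cloud for non-sensitive data, private cloud for sensitive data, and, as noted earlier, dedicated data centers for classified systems) amounts to a simple policy mapping from data sensitivity to hosting environment. A minimal sketch of that mapping follows; the category names and tier labels are hypothetical placeholders, not an agency standard.

```python
# Illustrative sketch: routing applications to a hosting tier by data
# sensitivity, per the private/public split described in the column.
# Sensitivity categories and tier labels are hypothetical.

DEPLOYMENT_POLICY = {
    "non-sensitive": "public cloud service",
    "sensitive": "private cloud (agency data center)",
    "classified": "dedicated data center",
}

def deployment_target(sensitivity: str) -> str:
    """Return the hosting tier for a given data-sensitivity level."""
    try:
        return DEPLOYMENT_POLICY[sensitivity]
    except KeyError:
        raise ValueError(f"unknown sensitivity level: {sensitivity!r}")

print(deployment_target("non-sensitive"))  # public cloud service
print(deployment_target("sensitive"))      # private cloud (agency data center)
```

Encoding the policy as data rather than scattered conditionals also makes it easy to revise as FedRAMP matures and more categories become safe to move to public cloud, which is the evolution the column anticipates.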

I am drawn to that model for several reasons:

1. It will continue to spawn competition for data center and private cloud services. As the Federal Risk and Authorization Management Program (FedRAMP) matures and as public cloud service providers address security concerns, it will become safe to move more sensitive data to public cloud services. Vendors that provide existing data center services have ample reason to ensure that they stay competitive -- namely, the looming threat offered by public cloud service providers.

2. By migrating applications to a private cloud capability now, agencies will have standardized and modernized, enabling them to more easily move those applications to public cloud providers sometime in the future.

3. Cloud brokerage models are evolving that will foster even more competition. The models allow agencies, via a broker, to easily shop their applications among multiple public cloud service providers.

Returning to the unifying theme of my columns on IT infrastructure, the key to making any of these efforts succeed is a commitment to consolidating IT infrastructure. Agencies need scale to deploy a sophisticated model of enterprise data centers, along with private and public clouds, that can drive significant efficiencies.

If an agency allows each of its programs or offices to go it alone, there might be some use of cloud computing on a program-by-program basis, but little efficiency will be gained and no standardization will happen. As a result, the agency is still faced with daunting complexity, inefficiency and lack of flexibility in its IT infrastructure.

 

About the Author

Richard A. Spires has been in the IT field for more than 30 years, with eight years in federal government service. He served as the lead for the Business Systems Modernization program at the IRS, then served as CIO and deputy commissioner for operations support, before moving to the Department of Homeland Security to serve as CIO of that agency. He is now CEO of Resilient Network Systems.


Reader comments

Mon, Jan 27, 2014 Shawn Hendricks Washington, DC

Really nice article, Richard. I agree with your premise. I also agree that consolidation should be at the component level. If the system gets too big, change and competition get very difficult. The end result could be vendor lock-in (see Ma Bell), which would drive prices up and eliminate the threat of competition. Also, federal agencies must realize that we are a small segment of the overall market. As such, we need to be cautious of unique requirements in this commercially based sector. As we become different, we potentially turn many vendors competing into only a handful.

Wed, Jan 22, 2014 Michael Duffy

Richard's advice to federal agencies is right on target. I would add two points for consideration. #1: Cloud-based infrastructure should help agencies increase the "speed to market" for new applications/online services, by eliminating the procurement and deployment cycle for hardware and software, and enabling rapid prototyping and testing of solutions on a variety of virtual platforms. #2: Top-tier cloud providers should be able to match or exceed security and reliability benchmarks for most civilian agencies. For these reasons and the ones Richard cited, agencies should be asking "how do I use cloud-based services," not "should I use cloud-based services."
