Is government procurement ready for the cloud?

Mention cloud computing to true believers and you’ll likely hear all about speed and agility. They’ll tell you that agencies can simply dial IT services up or down as needed to quickly support new mission plans or workload changes. As a bonus, agencies pay only for what they use instead of bankrolling the often idle, over-provisioned computing capacity common in most data centers.

Unfortunately, there’s a rub when it comes to the cloud. Many IT procurement practices and contracting vehicles were designed to help managers provision hardware and software, not on-demand services. Can the current acquisition practices translate easily to the dynamic world of cloud computing?




Not really, said Barry Brown, executive director of the Enterprise Data Management and Engineering Division at Customs and Border Protection, echoing a view shared by others in the federal government. With cloud computing, “the technology delivery model has changed,” he said. “What has not changed is the procurement model.”

The methodology gap between procuring IT systems and procuring IT services has been intensifying in the past year, ever since former Federal CIO Vivek Kundra outlined the government's cloud-first policy. That initiative seeks to reduce costs and increase IT acquisition flexibility by pushing federal IT systems to cloud environments. Each agency has until May to identify three IT resources that it will move to the cloud.

But the move is straining traditional procurement departments. Rather than promoting speed and agility, in some cases cloud initiatives are spawning extended contract negotiations and legal challenges that further delay agencies’ access to the resources they need.

Not all of the early obstacles are specific to the cloud, and those should fade as agencies gain experience. But other challenges stem from features that are essential to the cloud model, and they will persist. Technology executives will need to accommodate them with new procurement and vendor management practices if the switch to on-demand, utility computing is to succeed.

Counterpoint

But not everyone agrees that cloud services represent such a significant departure from past IT practices that they require new acquisition methods. Some say only minor changes are needed for future cloud acquisitions to be well served by existing contracting vehicles, such as the General Services Administration’s Alliant governmentwide acquisition contract and IT Schedule 70 blanket purchase agreements, which specify firm fixed prices for cloud services negotiated on behalf of the entire federal government.

“I don’t think cloud procurement is as different or problematic as people make it out to be,” said Larry Allen, president of Allen Federal Business Partners, which provides procurement policy support for government contractors. “I’m not an advocate for creating new cloud-based contract vehicles. It’s much better to use what’s out there.”

In fact, for all the contracting uncertainties, agencies are making progress toward the cloud-first deadline. GSA and the National Oceanic and Atmospheric Administration are just two examples of agencies with large-scale cloud initiatives. Last year, GSA moved 17,000 staff members to Google Apps for Government, a cloud-based e-mail and collaboration system, and NOAA awarded an $11.5 million, three-year contract to migrate 25,000 employees to the Google messaging platform.

Wake-up calls

But cloud procurements don’t always go smoothly. In some cases, the problems are inherent to the cloud, such as determining how much customization of services, if any, is acceptable. In other cases, procurement officers are still sorting out when and how to apply existing rules to the cloud environment. Working through those issues can put the brakes on cloud procurements.

For example, in October 2011, the Government Accountability Office upheld a protest by Technosource Information Systems and TrueTandem that challenged a specification in a GSA request for quotations for cloud-based e-mail services. The RFQ required that data services be located in the United States or other designated countries.

GSA responded to the challenge in part by arguing that the government needs to control where information is stored because of concerns about foreign jurisdictions asserting access rights to data that resides in or moves through their country. Location would likely not have been an issue for agencies that opted to host services in-house, but in the cloud, data could conceivably be stored anywhere in the world.

Nevertheless, the two contractors argued that the GSA requirement unduly restricted competition. GAO agreed, saying that GSA had failed to establish a legitimate government need for the stipulation, and called on the agency to amend the RFQ to reflect its actual needs regarding data centers located outside the United States. After reviewing the decision, GSA issued an amended RFQ, nearly six months after issuing the original request.

Earlier, the Interior Department became embroiled in an even bigger contracting controversy when a lawsuit by Google put the brakes on a $59 million, five-year external private cloud intended to provide e-mail and collaboration capabilities for 88,000 of Interior’s employees. The suit charged that Interior’s request for proposals was “unduly restrictive of competition” because it specified a private cloud solution using Microsoft Business Productivity Online Standard Suite. Early last year, a federal judge sided with Google in a ruling that said Interior violated federal acquisition rules for open competition.

Part of the ruling stemmed from Interior’s choice of Microsoft technology, which the department had been using in a traditional implementation. The bigger question appeared to be Interior’s stipulation of a private cloud, which Google, as a supplier of technology for multi-tenant public cloud solutions, could not support.

Knowing that the private cloud stipulation might be challenged, Interior’s procurement and legal staffs tried to be proactive by documenting market research the agency had gathered about the potential risks of public clouds, said William Corrington, Interior’s CTO at the time and now cloud strategy lead at Stony Point Enterprises, a consulting firm that specializes in cloud strategies for federal agencies.

According to court documents, Interior said its research led it to a single-user, private cloud solution because of the sensitive nature of the data that would be stored in the cloud, the agency’s tolerance for risk, and “the benefits and liabilities of each cloud model.”

The case illustrates how questions about emerging cloud technologies add complexity to government procurements. As a result, some Interior officials felt they were being forced to accept undue risks because acquisition rules altered the agency’s original cloud choice, Corrington said.

The legal challenges also led to significant delays. Interior awarded the original contract in late 2010 but is still trying to move the project forward. In early January, the agency issued a new RFP reopening the bidding. This time the department is calling for a commercial provider that can transition its current in-house e-mail systems to “an integrated, cost-effective, cloud solution.” It makes no mention of a private cloud or specific products.

Such legal challenges and protracted contract negotiations over sticking points such as security and service-level monitoring are prompting some observers to call for new methodologies to guide everyone in the procurement community.

“Our acquisition people are doing the best they can, but progress [toward cloud adoption] represents transformation and change for IT,” said Wolf Tombe, chief technology officer at Customs and Border Protection. “That transformation and change require that some of our partners and stakeholders change along with us.”


Reader comments

Mon, Feb 6, 2012 Jaime Gracia Washington, DC

Cloud computing does not need yet more MACs. They already exist. What is needed is standardized processes for security and implementation that can be used across the federal space, in conjunction with industry collaboration. The process that must be developed properly up front is the requirements process, not the acquisition process. Rapid technology insertion requires this direct collaboration with industry, along with calculated risks based on best practices and a strong relationship with industry partners. There should not be a reinvention of the wheel here, just a focused and disciplined process for developing the government's needs. Easier said than done.
