Cloud computing: Is it secure enough?

New risks could cancel out potential benefits

Cloud computing might now officially be the hottest trend in enterprise computing, thanks in no small part to the Obama administration's public endorsements. But many federal chief information officers still lack a clear picture of the security and management challenges that come with moving data, applications and business processes into the information technology ether.

How does the cloud play with FISMA?

A team at the National Institute of Standards and Technology is preparing formal guidance scheduled for release this summer on cloud computing adoption. There will be a chapter devoted to security, but here is a preview from several government sources of how cloud computing strategies will mesh with agencies’ obligations under the Federal Information Security Management Act.


• NIST's FISMA team has determined that the existing NIST 800-53 "Recommended Security Controls for Federal Information Systems" guidance is applicable to cloud computing environments without alteration.

• FISMA provides cloud computing vendors with a common set of security practices for their federal clients.

• Standardized information technology infrastructures and in-house security expertise from cloud computing vendors might help agencies raise their FISMA scores.

• Cloud computing's multi-tenancy platforms and the lack of direct agency control over security could introduce new risks.

• Clouds make organizational boundaries fuzzy, complicating definitions of what falls under FISMA evaluations.

• FISMA snapshot audits make evaluations of cloud security difficult. However, NIST is planning to recommend that a single agency perform certifications and accreditations on behalf of other agencies and possibly the entire federal government.

The good news is that there are protections to guard against threats and mismanagement, and in some cases, the solutions rely on already familiar technologies.

Cloud computing received a major boost from President Barack Obama's fiscal 2010 budget request and a related White House report in May that called for a transformation of federal IT with the widespread adoption of the cloud delivery model.

Proponents such as federal CIO Vivek Kundra see greater efficiencies and lower costs when agencies partner with government or commercial service providers that deliver applications and IT resources through subscriptions for software, infrastructures and platforms. Yet most agencies have little experience with this new model.

The worries about cloud security aren’t entirely theoretical. Commercial cloud provider Google recently found a flaw in its Google Docs software-as-a-service application that inadvertently caused it to share user files. Elsewhere, an employee of another SaaS provider was duped by a phishing attack and leaked a customer list, said Chenxi Wang, principal analyst of security and risk management at Forrester Research. Both providers count numerous government agencies among their customers.

Agencies don’t need to go it alone as they try to put these early breaches into a larger security context. The National Institute of Standards and Technology is due to release formal guidance on cloud computing adoption this summer, with a chapter devoted to security.

In the meantime, as the administration and others push for clouds, agency officials and IT consultants said CIOs should start now to understand the security implications and begin taking steps to protect their organizations.

As part of the government’s new cloud computing push, the General Services Administration and the federal CIO Council have joined a working group that Kundra has charged with exploring the strategy’s implications.

“We in the working group believe that cloud computing offers the promise of improved security, through the sharing of a common infrastructure that lets agencies tap into a common pool of top information security experts,” said GSA CIO Casey Coleman.

But Coleman and others said clouds also experience all of the threats faced by in-house IT platforms, with two additional challenges.

Insider threats

Traditional IT security practices focus on strong outer shell defenses in the form of firewalls and intrusion detection systems. The value of that approach diminishes with cloud computing, said Peter Mell, NIST senior computer scientist and leader of the agency's cloud computing project.

“We’ve always needed a strong center, but in cloud computing, it’s become even more obvious that that’s necessary,” Mell said.

The reason is multi-tenancy, the industry term for serving various subscribers through a common pool of IT resources. Multi-tenancy differs from typical outsourcing and co-location arrangements in which clients run their applications and data on computer servers dedicated just to them. Multi-tenancy might provide data thieves with an opportunity for one-stop shopping, when, for example, a staff member from one client naively opens an e-mail attachment and lets in a virus.

“The possibility exists that if a hacker can get into one organization’s information it may then be able to move laterally and potentially get at everybody’s information,” said Jim Reavis, executive director of the Cloud Security Alliance.
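The lateral-movement risk Reavis describes is usually countered by enforcing tenant isolation in the data layer itself, so that every read and write is scoped to a single subscriber. A minimal sketch of that pattern (the store, tenants and field names are hypothetical, not any vendor's actual API):

```python
class TenantScopedStore:
    """Toy multi-tenant key/value store that enforces tenant isolation.

    Every record is namespaced by tenant ID, and reads go through a
    handle bound to one tenant -- so compromising one tenant's
    credentials does not let an attacker enumerate another's data.
    """

    def __init__(self):
        self._data = {}  # (tenant_id, key) -> value

    def handle_for(self, tenant_id):
        return _TenantHandle(self, tenant_id)

    # Internal methods: reachable only through a tenant-bound handle.
    def _put(self, tenant_id, key, value):
        self._data[(tenant_id, key)] = value

    def _get(self, tenant_id, key):
        return self._data[(tenant_id, key)]


class _TenantHandle:
    def __init__(self, store, tenant_id):
        self._store = store
        self._tenant_id = tenant_id

    def put(self, key, value):
        self._store._put(self._tenant_id, key, value)

    def get(self, key):
        # The tenant ID comes from the handle, never from the caller,
        # so a handle for "agency_a" can never read "agency_b" records.
        return self._store._get(self._tenant_id, key)


store = TenantScopedStore()
a = store.handle_for("agency_a")
b = store.handle_for("agency_b")
a.put("payroll", "sensitive-a")
b.put("payroll", "sensitive-b")
print(a.get("payroll"))  # each agency sees only its own record
```

Real clouds enforce this with hypervisor, network and database controls rather than application code, but the principle is the same: the isolation boundary must sit below the point an intruder can reach.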

Control issues

Rather than directly managing security for internal IT systems, cloud subscribers must indirectly depend on their service providers to put effective security practices in place and then maintain them. This lack of direct control might affect security in a number of ways.

For example, agencies no longer directly employ the privileged users, such as systems administrators, who hold wide access rights to data and applications. Nor can they directly monitor or discipline those users to guard against missteps.

In addition, disposing of data and storage devices when they are no longer needed can be another challenge.

“How do you know the data’s really been deleted?” Reavis asked. “This brings up issues with how data is archived and how storage media is retired, to avoid a scenario where information starts showing up when those devices get recycled.”
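One common answer to the deletion question Reavis raises is "crypto-shredding": encrypt each data set under its own key and, when the data must go away, destroy the key rather than chase every copy across backups and retired media. A minimal sketch of the idea, with an illustrative XOR keystream standing in for real authenticated encryption such as AES-GCM (it is not a secure cipher, and the record names are made up):

```python
import hashlib
import secrets

class CryptoShredStore:
    """Sketch of crypto-shredding: destroying the key renders every
    copy of the ciphertext (primary, backup, archive) unreadable."""

    def __init__(self):
        self._keys = {}         # record_id -> per-record key
        self._ciphertexts = {}  # record_id -> encrypted bytes

    @staticmethod
    def _keystream_xor(key, data):
        # Toy keystream derived with SHA-256 -- illustration only;
        # a real system would use a vetted cipher such as AES-GCM.
        out = bytearray()
        counter = 0
        while len(out) < len(data):
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(b ^ k for b, k in zip(data, out))

    def put(self, record_id, plaintext):
        key = secrets.token_bytes(32)          # fresh key per record
        self._keys[record_id] = key
        self._ciphertexts[record_id] = self._keystream_xor(key, plaintext)

    def get(self, record_id):
        key = self._keys[record_id]            # raises KeyError once shredded
        return self._keystream_xor(key, self._ciphertexts[record_id])

    def shred(self, record_id):
        # Only the key is destroyed; stale ciphertext left on recycled
        # media is now just unreadable bytes.
        del self._keys[record_id]


store = CryptoShredStore()
store.put("case-42", b"citizen records")
assert store.get("case-42") == b"citizen records"
store.shred("case-42")
# store.get("case-42") now raises KeyError: the data is unrecoverable.
```

The approach shifts the disposal problem from tracking storage media to managing keys, which is a much smaller and more auditable task.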

Another concern is that because agencies no longer store their data themselves, they need to identify not only the cloud provider’s main data centers but also the locations of its backup and archive sites.

“Cloud providers operate on a global basis, and you can expect to run into regulatory conflicts if you don’t do your due diligence,” Reavis said.

New tactics

As formidable as these challenges may be, experts say overcoming them is not rocket science. The solutions may actually seem very familiar to agency CIOs. 

“It’s more about rethinking — not reinventing — the enterprise security model,” said David Linthicum, principal at Booz Allen Hamilton. “Most of the things we have been doing in terms of security carry over nicely to cloud computing.”

These include a range of technologies for data encryption, data segregation and identity management. For example, data encryption and user IDs based on public identity certificates and Common Access Cards are tools being used by the Defense Information Systems Agency in its SaaS initiatives.

In addition, a series of access control measures can help guard against data becoming available to unauthorized cloud users operating in a multi-tenant environment, said Henry Sienkiewicz, DISA technical program director of computer services.

“Access control down to the individual layer, as well as even down possibly to individual data elements, becomes a crucial component,” Sienkiewicz said. “You build in intrusion detection not just on the external side; you build it on the internal side. You build in access control based on roles, responsibilities and content throughout the environment.”
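The layered model Sienkiewicz describes can be sketched as role-based checks applied per data element rather than only at the network perimeter. A minimal illustration (the roles and record fields are hypothetical, not DISA's actual scheme):

```python
# Role-based access control applied down to individual data elements.
# Roles, fields and the sample record are purely illustrative.
RECORD = {"name": "J. Smith", "grade": "GS-13", "ssn": "***-**-1234"}

# Which fields each role is cleared to read.
FIELD_PERMISSIONS = {
    "hr_specialist": {"name", "grade", "ssn"},
    "help_desk": {"name"},
}

def read_record(role, record):
    """Return only the fields this role may see; unknown roles get nothing."""
    allowed = FIELD_PERMISSIONS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(read_record("help_desk", RECORD))      # only the name field
print(read_record("hr_specialist", RECORD))  # the full record
```

The same filter would sit in front of every internal service, so that even a request originating inside the cloud is checked against the caller's role and the content requested.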

Although federal cloud subscribers don’t have to sweat the details of implementing those controls, they’re not completely off the hook when it comes to security. Cloud users must devote significant time and resources to evaluating cloud providers, and security consultants warn that these analyses shouldn’t be hasty, even if the cloud is managed by another federal agency.

Some cloud customers are following the advice of consultants and using the accounting industry’s SAS 70 Type II audits of internal controls or the ISO 27001 information security standard to assess cloud providers. Experts caution, however, that these regimens offer only high-level security information.

“They’re good starting points because they give you a baseline of the vendor’s practices," Wang said. "However, those audits are pretty subjective."

In addition to those results, agencies should ask their own detailed questions about how the provider protects data, what happens to data once the contract ends, and what procedures are in place if agency data is ever breached.

“Ask for a service-level agreement that’s as detailed as possible about the recourse actions to cover your bases,” Wang said.

Some security experts recommended that agencies forgo traditional moment-in-time audits of a cloud’s security performance in favor of assessments that provide ongoing insights.

One choice, from a consortium that includes the Center for Strategic and International Studies, is the 20 Critical Controls, which promotes automated and continuous monitoring of security performance and draws on guidance from the National Security Agency, the U.S. Computer Emergency Readiness Team, the Defense Department and other federal sources.

“The foundation for the 20 controls is what vulnerabilities are attackers exploiting today, what controls are effective against those attacks, and how you can validate those controls with automated means,” said John Gilligan, an IT consultant and consortium member who formerly was CIO of the Air Force and Energy Department.

Agencies should ask cloud providers to report their scores on the 20 Critical Controls during the bidding process, said Alan Paller, director of research at the SANS Institute. Agencies should also insist that vendors continuously report results as part of any service-level agreement between the agency and the provider.
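In practice, the continuous reporting Paller describes means the agency ingests the provider's control scores on a schedule and flags any control that falls below the agreed floor, rather than waiting for an annual audit. A minimal sketch with made-up control names and thresholds (not the actual 20 Critical Controls scoring format):

```python
# Hypothetical SLA minimums for a few automated control scores (0-100).
SLA_MINIMUMS = {
    "inventory_of_devices": 90,
    "boundary_defense": 85,
    "controlled_admin_privileges": 95,
}

def evaluate_report(scores, minimums=SLA_MINIMUMS):
    """Return controls that breach the SLA (missing or below the floor)."""
    breaches = {}
    for control, minimum in minimums.items():
        score = scores.get(control)
        if score is None or score < minimum:
            breaches[control] = (score, minimum)
    return breaches

# A hypothetical monthly report from the provider's monitoring feed.
latest = {"inventory_of_devices": 97,
          "boundary_defense": 80,          # below the 85 floor
          "controlled_admin_privileges": 99}
print(evaluate_report(latest))  # flags boundary_defense for follow-up
```

Writing the thresholds into the service-level agreement itself gives the agency a contractual basis for acting on each flagged breach.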

Even after finding a cloud provider with the right security mechanisms and oversight in place, agencies should move slowly.

“It’s about selecting the right applications to move to a cloud-based service,” said Mark Nicolett, vice president and research director at Gartner. For most agencies it will be an application-by-application decision, with the lower-risk applications moving into a cloud environment first.

GSA’s Coleman said many of those decisions will go beyond weighing general security concerns related to clouds to considering what kinds of clouds are appropriate for individual applications.

“One of the key discussions will be private cloud vs. public cloud,” she said. “Cloud computing isn’t a one-size-fits-all solution. Depending on the problem you’re trying to solve, it may not be the right solution in every case."

About the Author

Alan Joch is a freelance writer based in New Hampshire.

