Why you can quit worrying about cloud security

Cloud security isn't as scary as you think, as long as you understand and respect these 4 points

At first blush, the cloud-first mandate that federal CIO Vivek Kundra announced late last year makes sense. By spurring agencies to send one application to the cloud in the next year and two more shortly after that, it offers a clear path for cutting IT costs and consolidating federal data centers — two long-standing goals of the Obama administration.

But digging deeper, agency CIOs find themselves facing a harsh reality: Are budget imperatives superseding equally valid concerns about the safety of government systems and data in today’s still-evolving cloud environments?

The answer depends in part on how quickly IT managers can enact new security techniques tailored for the cloud, some of which are still works in progress.

“We must push the envelope,” said James Williams, CIO at NASA’s Ames Research Center, which is developing the Nebula infrastructure as a service offering for the entire agency. “It's not so much about making the cloud secure but about using the cloud to leverage best practices in security across an enterprise.”

Security doesn’t have to be the bogeyman that it’s typically been in cloud discussions, but lingering fears and false impressions can be hard to shake. Here are four areas in which the popular views on cloud security are wrong or don’t tell the whole story. In fact, there are — or soon will be — practical solutions to all the main cloud security concerns.

1. Sharing the cloud with strangers isn't always a deal breaker.

Cloud providers can deliver IT services more economically than in-house data centers by pooling large numbers of organizations on shared computing resources. Unfortunately, those multitenant environments, especially when operated by commercial providers, have raised fears of data leakage and unauthorized access to sensitive systems.

Many experts say those worries do have some validity. “Harvesting information from [a] cache or out of system memory are still viable risks,” said Michael Donovan, chief technologist of strategic capabilities at Hewlett-Packard.

Another potential problem arises with how cloud providers report evidence of potential breaches. Too little information blinds cloud users to possible threats, while unrestricted reporting might actually make everyone more vulnerable. For example, reporting on a vulnerability that affects multiple tenants might give one customer an opportunity to breach another’s security defenses before the gap is closed.

“That which allows you to protect yourself also allows you to attack other co-tenants,” said Mark Rasch, director of cybersecurity and privacy consulting at Computer Sciences Corp.

Those risks are real, but they shouldn’t be deal breakers if proper steps are taken, especially given the potential financial rewards of multitenancy services. “You make a mistake if, in order to get security, you avoid co-tenancy entirely,” Rasch said.

There are ways to make such environments safer. At the Treasury Department, for example, officials are choosy about what they send to the cloud.

“We are evaluating public clouds on a case-by-case basis,” said Diane Litman, Treasury’s acting deputy assistant secretary of information systems and CIO. The agency’s flagship site, Treasury.gov, recently relaunched on a commercial, public cloud platform, a first for a Cabinet-level agency. Related sites, such as the Consumer Financial Protection Bureau’s, are also slated to run in a public environment. Both were appropriate for public clouds because they contain information that’s already publicly available, Litman said.

By contrast, Treasury continues to take a cautious approach toward multitenant solutions for highly sensitive information, such as IRS tax files, until officials are assured that the information will be adequately protected, she added.

NASA is incorporating virtual local-area networks and data encryption into its cloud for added peace of mind. The agency’s Nebula will combine those security technologies to segment data traffic and make it unintelligible even if an unauthorized party were to access it.

But Williams warned that cloud customers need to look below the surface. “Serious attention must be paid to crypto-implementation for processing and storage,” he said. He advises administrators to investigate each provider’s encryption strategy to answer the ultimate question: “Do you trust the algorithm as implemented by the vendor?”
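The encrypt-before-it-leaves-the-enclave approach Williams describes can be illustrated with a short sketch. This is a hypothetical illustration, not NASA's implementation; it assumes the widely used Python `cryptography` package, with the function names invented for the example. The key point is that the agency holds the key, so a co-tenant who reads leaked storage blocks sees only ciphertext.

```python
# A minimal sketch (hypothetical names) of client-side encryption before
# data is sent to shared cloud storage. Assumes the `cryptography` package.
from cryptography.fernet import Fernet

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a payload locally so the provider -- and any co-tenant --
    only ever handles unintelligible ciphertext."""
    return Fernet(key).encrypt(plaintext)

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt locally; authentication fails if the blob was tampered with."""
    return Fernet(key).decrypt(ciphertext)

key = Fernet.generate_key()  # kept in the agency's key store, never in the cloud
blob = encrypt_for_upload(b"sensitive record", key)
assert blob != b"sensitive record"
assert decrypt_after_download(blob, key) == b"sensitive record"
```

Note that this sketch sidesteps, rather than answers, Williams' question: it only works if you trust the algorithm and its implementation, which is exactly what he advises administrators to verify for each provider.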

2. FedRAMP is a good start, but only the beginning.

Federal officials are optimistic that the budding Federal Risk and Authorization Management Program will simplify cloud security, but agencies shouldn’t let their guards down. Even after it’s finalized, don’t expect FedRAMP to relieve you of all security burdens.

FedRAMP, a draft of which is now out for public comment, would allow multiple agencies to replace their individual audits of a cloud provider with a single security inspection shared by all government cloud users. FedRAMP is expected to be finalized sometime this fiscal year.

In addition to relieving individual agencies of duplicative audit work, FedRAMP could speed cloud adoption for agencies attempting to meet cloud-first milestones. Having a governmentwide standard for security inspection protocols would also formalize discussions between cloud providers and agency customers.

“We’ve gone into cloud engagements as a systems integrator [for an agency] and asked cloud providers if we could get a list of who has access to a client’s systems,” said Patrick Cronin, executive consultant at IT services company CGI. “They’ve just said no. FedRAMP is going to level the playing field.”

But Alan Paller, director of research at the SANS Institute, criticized FedRAMP in a blog post for relying too much on low- and medium-level implementations of the dozens of security controls outlined in the National Institute of Standards and Technology’s Special Publication 800-53. At those levels, controls to protect against unauthorized access and other threats are in place, but they aren’t fleshed out with additional practices for continuously analyzing and testing their effectiveness.

Paller called for more emphasis on the 20 critical controls described in the Consensus Audit Guidelines produced by a consortium of federal agencies and private organizations. The controls range from performing audits to identify authorized and unauthorized devices to implementing data loss prevention systems.

Although there is some overlap between CAG and SP 800-53, the guidelines differ in significant ways. Specifically, CAG focuses on a core subset of security practices, three-quarters of which can be monitored automatically and continuously.

Concerns like those are leading some IT managers to see FedRAMP as only a first step in cloud security. That means agencies will need to determine what additional security controls and ongoing performance information are relevant to their particular situations.

“Cloud is a subset of IT, and FedRAMP will help us understand the security controls in that subset,” said Richard Hale, chief information assurance executive at the Defense Information Systems Agency. “But the totality of [IT security] is still going to be each agency’s responsibility.” Hale oversees security for the Rapid Access Computing Environment, DISA’s platform as a service for Defense Department agencies.

What areas are outside FedRAMP’s oversight? Two important ones are account management for cloud users and the security of Web browsers, which are often the primary means for agency employees to tap into a cloud.

“If someone hijacks your browser and sees all your keystrokes, then they can hijack your cloud account,” said Lee Badger, a computer scientist and acting program manager for cloud computing at NIST.

3. Outsourcing to the cloud? Don't abdicate security.

Cloud computing increases the importance of a security best practice that every agency CIO might soon need to implement: continuous monitoring of IT resources and activities.

Continuous monitoring generates a series of reports at regular intervals. The draft version of FedRAMP calls for quarterly and annual testing. Its main purpose is to assure cloud users that the security controls accredited during the full system audit continue to work as advertised.

“In a dynamic cloud environment, there’s more work that has to be done to make sure that what you’ve checked today is still valid tomorrow,” Donovan said. “Continuous monitoring is one aspect of that.”

But Paller and other security experts say that for continuous monitoring to be effective, reports must be run every day or two, which is much more frequent than what FedRAMP now calls for.

Williams agreed, adding that NASA’s Nebula is adopting what he calls a situational awareness strategy for near-real-time monitoring of risk for its agency cloud customers. For example, because NASA’s cloud can immediately detect changes to any of its IT resources, a new request for an IP address kicks off a security scan, he said.
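The event-driven pattern Williams describes can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Nebula's actual code; all class and event names are invented, and the scan itself is a stand-in.

```python
# A minimal sketch (hypothetical names) of event-driven security monitoring:
# any change to a watched resource is inspected, and certain events -- here,
# a request for a new IP address -- immediately trigger a scan.
from dataclasses import dataclass, field

@dataclass
class SecurityMonitor:
    scans: list = field(default_factory=list)

    def on_event(self, event: dict) -> None:
        # Inspect every resource-change event; address allocations
        # kick off a vulnerability scan right away.
        if event.get("type") == "ip_address_requested":
            self.scan(event["address"])

    def scan(self, address: str) -> None:
        self.scans.append(address)  # stand-in for queuing a real scanner job

monitor = SecurityMonitor()
monitor.on_event({"type": "ip_address_requested", "address": "10.0.0.7"})
monitor.on_event({"type": "cpu_usage", "value": 0.4})
assert monitor.scans == ["10.0.0.7"]  # only the address request triggered a scan
```

The design choice worth noting is the push model: rather than waiting for a quarterly report, the monitor reacts the moment the environment changes, which is what makes near-real-time risk awareness possible.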

Continuous monitoring will also help keep the Federal Information Security Management Act relevant for cloud environments. FISMA requires agencies to provide the Office of Management and Budget with status reports about their IT security systems. Originally, agencies filed paper-based updates once a year, but an executive order issued last year added real-time status reporting to supplement annual FISMA documents. Agencies must include outside contractors in FISMA reviews, which now bring in cloud providers.

“I think that the old FISMA is obsolete, but the new [certification and accreditation guidelines in] FISMA are moving in the right direction with continuous monitoring,” said Curt Aubley, vice president and CTO for cybersecurity and next-generation solutions at Lockheed Martin, which sells a turnkey private cloud solution that complies with the revised FISMA standard.

FedRAMP will complement the FISMA process by auditing cloud providers on their ability to meet a range of security requirements drawn from NIST’s guidance. That eliminates the need for agencies to perform their own reviews of cloud providers under FISMA, although agencies will still need to report their use of FedRAMP-authorized clouds in their FISMA filings.

In addition to hammering out continuous monitoring requirements with cloud providers, agencies should also insist on compliance with NIST’s Security Content Automation Protocol. That protocol spells out standards for how cloud providers format and deliver data from continuous monitoring systems so agencies can quickly make sense of the information they receive.
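The value of a common reporting format is easy to show in miniature. The document below is a deliberately simplified, hypothetical result file, not the real SCAP/XCCDF schema; it only illustrates why standardized, machine-readable output lets an agency script the same compliance check against every provider.

```python
# A simplified, hypothetical scan-result document (NOT the real SCAP schema),
# showing how a standard format makes monitoring output scriptable.
import xml.etree.ElementTree as ET

REPORT = """
<scan-results>
  <rule id="account-lockout" result="pass"/>
  <rule id="patch-level" result="fail"/>
</scan-results>
"""

def failing_rules(xml_text: str) -> list:
    """Return the ids of all rules a provider's scan reported as failing."""
    root = ET.fromstring(xml_text)
    return [r.get("id") for r in root.findall("rule") if r.get("result") == "fail"]

assert failing_rules(REPORT) == ["patch-level"]
```

Because every provider emits the same structure, the same few lines work regardless of whose cloud produced the report; that is the practical payoff of insisting on SCAP compliance in the contract.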

4. Off-the-shelf security terms are often negotiable.

Not all cloud security challenges are caused by still-evolving best practices and immature technologies. Some are the result of ongoing confusion about where a cloud service provider’s data management responsibilities end and the agency’s begin.

For example, don’t assume that the cloud provider will automatically back up data and store it on off-site tapes — a reasonable assumption under long-standing data protection practices. Similarly, a traditional intrusion detection system might not be included in a standard cloud contract.

“Those are services you can add, but if you don’t ask, you are not getting them oftentimes,” Cronin said.

Avoid unpleasant surprises and finger-pointing by diligently combing through cloud quotes to clearly understand what is being provided. And be ready to negotiate for anything that’s not spelled out in the document.

Contract negotiations are also the time when agencies should address another long-standing concern about cloud security: offshore data farms.

“We worked through our concerns with the provider to reduce the risk that was acceptable to us by focusing specifically on where the data was stored and who has access to it,” Litman said. “We also dealt with jurisdictional issues in terms of the ability to prosecute any illegal activity.”

Agencies have support in this area. The General Services Administration’s blanket purchase agreement for cloud storage, virtualization and Web-hosting solutions requires cloud providers to use U.S. data centers, Cronin said. In addition, he expects FedRAMP to include the same requirement.

Contract talks are also an opportunity to set expectations for continuous monitoring by making the practice a cornerstone of the compliance sections of contracts negotiated with public cloud providers.

But beware: Large federal agencies with their impressive buying power might have the most leverage when it comes to expanding service-level agreements. “Contracts are often one size fits all for state and local governments or a small city that wants to go to the cloud,” Rasch said.

Will measures like these make clouds secure enough for agencies to safely comply with the cloud-first mandate? Some industry observers are optimistic.

“There is initially a belief that the cloud may not be as secure as [an agency’s] own infrastructure,” Cronin said. “But a cloud solution can be more secure than many federal systems that are on legacy infrastructures using legacy controls.”


Reader comments

Fri, Feb 25, 2011 Reuben Snipper HHS/ASPE

I think this article did not go far enough in addressing several issues around cloud computing. First, cloud computing means you are totally dependent on the vendor to report security and capacity problems. They have little built-in incentive to report on either of these issues for business reasons. Second, during a failure, it can be quite difficult to tell what the problem is. Some failures can be hard to locate between the extra layers. Recent problems with major cloud computing vendors have illustrated this problem. Finally, in many environments, the network may not be set up to support WAN capacity and reliability. The architecture may have focused on internal traffic and not on the connection to the Internet and therefore the cloud. Upgrading the capacity and reliability of these systems can eat up a lot of the savings.

Wed, Feb 2, 2011 Sam Nicholson ogt11.com

So, I can quit worrying about cloud security if I properly implement security with my cloud initiatives. With that approach I can also stop worrying about cancer if I don't get cancer. Bad headline. Decent article.

Wed, Feb 2, 2011 John Sutton Fairfax

I am working with leaders in the VMware space and Information Assurance companies around continuous compliance monitoring. It is interesting that more of the innovation for monitoring information is coming from smaller, private companies that have tools that are still not being reviewed by government agencies, which tend to rely on the large public companies/industry leaders, but not really leaders in innovation in security-in-the-cloud. Hopefully there will be more clarity with organizations like the Cloud Security Alliance and new people coming into the fold. One of the biggest issues with many of these new cloud providers winning contracts in the government space is the lack of monitoring and management tools, because many of these groups don't have the available APIs that allow for maximum ability to continuously monitor. Or, they claim to do all the monitoring within their own solution: What about third-party access to monitor and enforce security compliance? Dual controls? Secondary validation?

Tue, Feb 1, 2011

One point that just drives me nuts is the assertion that FISMA is about C&A and that migration away from C&A makes FISMA obsolete. And it is just as irksome when people like Curt Aubley who supposedly are industry leaders perpetuate that myth. (“I think that the old FISMA is obsolete, but the new [certification and accreditation guidelines in] FISMA are moving in the right direction with continuous monitoring,” said Curt Aubley, vice president and CTO for cybersecurity and next-generation solutions at Lockheed Martin.) FISMA is not about C&A. OMB's implementation of FISMA under the Bush administration mistakenly chose C&A as a metric (and ultimately failed to improve federal IT security while simultaneously wasting hundreds of millions of dollars applying C&A to legacy systems). The core tenets of FISMA are still valid: accountability by the head of the agency, risk management, including IT security in the planning and budget process, consolidation of security standards between civilian, DoD, and intelligence agencies, etc. FISMA was designed to fix a management problem (thus the term "management" in the title). It was not designed to implement C&A, and linking FISMA to C&A distracted the government from the real intent of FISMA.

Tue, Feb 1, 2011 John Denver

Security is always a concern...but not the only reason to question the rush to virtualization. We've implemented cloud apps and found that, for middleware, we need physically equivalent-sized VMs, which makes the effort to get into the cloud something that only benefits the vendors that are licensing cloud software. The effect on our admins is that they now have an additional layer to manage - no savings there. We needed to purchase the virtualization software in addition to all the other standard needs - no savings there. We need to purchase about as much hardware and power it - no real savings there. We need to ask ourselves why we're doing this, then ask our IT leaders why they are forcing this model, even when it oftentimes doesn't make sense. The cloud has its place, but it doesn't belong everywhere.
