Cloud security: Feds on cusp of change

Experts assess the future of certification in cloud computing

Editor's Note: This article was corrected May 6. The original article incorrectly stated that if vendors were unable to meet minimum security requirements set by a new FedRAMP joint authorization board, they would be unable to meet requirements for all agencies. Vendors may still be able to meet an individual agency's minimum security requirements, however, if that agency opts to accept greater risk and certify a system against a lower security bar than the one set by the joint authorization board.

The federal government is on the cusp of fundamental changes in the way it manages information technology security risks, but those risks will grow more complicated as agencies begin embracing on-demand computing, according to a panel of public-sector cloud computing experts. The discussion was part of a May 4 technology conference in Washington, D.C., on cloud computing, knowledge management and open government innovations, put on by 1105 Government Information Group.

Coincidentally, the Treasury Department confirmed on the same day that it had shut down four Web sites hosted by a cloud-service provider after a security analyst found malicious code infecting them.

Security in a cloud computing environment needs to be considered as three distinct areas, said Chris Hoff, director of Cisco’s Cloud and Virtualization Solutions.

“There’s security in the cloud, which are tactical solutions; there’s security by the cloud, where security services are delivered via the cloud; and there’s security for the cloud that can provide security for other hosted services,” Hoff said. It’s also important to think about how these security dimensions affect service levels, he added.

Security risks — and rules that force agencies to duplicate the work of certifying the security of their information systems — remain one of the biggest obstacles to adopting cloud computing strategies, said Peter Mell, a computer scientist at the National Institute of Standards and Technology and vice chair of the federal government’s Interagency Cloud Computing Advisory Council.

Mell outlined how a new government program called FedRAMP aims to address that problem by streamlining the certification process, so that an information technology application certified for one agency will be available for all agencies to use. And it would help industry too, he said.


“What FedRAMP is doing complements cloud security guidance,” Mell said. The program, which is in the final stages of being introduced to agencies, has gained a lot of momentum and, importantly, has funding to get started, he added. “Our design is to publish security requirements so vendors can implement against those requirements and so their systems can be leveraged by all the agencies,” he said.

Developing minimum security requirements hasn’t been easy, he said.

“I thought we’d get agencies together and develop a baseline to meet most of their needs,” he said. “In practice, it is necessary to meet the minimum requirements of all the authorized agencies,” which include the General Services Administration, and the Homeland Security and Defense departments. Those agencies “are going to jointly authorize security requirements, but you have agencies that need to meet them — and the tendency is to set a fairly rigorous security bar,” he said.

Mell noted that if vendors can’t meet the FedRAMP minimum requirements, they may still be able to meet an individual agency's minimum security requirements if that agency opts to take on more risk than would be accepted by the authorization board.

A primary concern raised during the panel discussion was how agencies would avoid getting locked into proprietary arrangements where infrastructure services and software services were inextricably interwoven.

“The underpinning of a lot of the cloud providers relies on a hybrid of infrastructure and software-driven solutions,” Hoff said. “Those solutions have traditionally been disconnected from the infrastructure; they have sort of been floating on top. But when the software that powers [cloud computing services] is deployed, then you have another layer of security to consider.”

Mell's team “has been able to describe what’s needed, but vendors are struggling with interpreting what that means they must do,” Hoff said.

Continuous monitoring will also play a crucial role in cloud computing security. Mell said that, in addition to preparing guidance for virtualization and cloud computing, NIST is also considering developing guidance for continuous monitoring of network systems.

“It’s not just about how you’re protecting your systems and whether you have proper security,” said Alex Hart, director of VMware’s Public Sector Channel Sales. “It’s also about when — not if, but when — something happens, how quickly can you get back to steady state? In the commercial world, every minute down represents dollars lost. In the public sector, it’s more of a trust clock [ticking]. So if you can take your organization back to its previous state, prior to incident, that is key,” he said.

Another security concern connected to cloud computing is how to keep up with the velocity of changes, and whether certification and accreditation will still mean anything in a future where computing is done by service providers.

Mell said one answer — and the current design being considered by the Cloud Computing Advisory Council — “is for the service provider to provide a plan for how they will do continuous monitoring. FedRAMP will provide oversight, to ensure they are really doing what they said they would do.”

The other answer, he said, “is automation, automation, automation. My every intention is to plug in automation schemes as much as possible as they become available, which is in line with our risk management approach.”

One sign that agencies and providers are making progress toward improved security is the movement away from focusing primarily on securing the perimeter of network systems, said Bob Wambach, senior director at EMC Corp.

Security has to be built throughout the system, he said, and it has to be consistent throughout the infrastructure. “So as you converge the infrastructure into the cloud, security is built in using simple [standardized security] building blocks, rather than relying upon layers of security services,” he said.

Hoff, however, warned agency IT officials to be aware that even that approach has limitations. He said that while mass market or public utility service providers are committed to secure cloud computing, they operate in ways similar to socialist health care systems, where the notions of “good enough” and “maximum availability” generally prevail over the more specialized needs that agencies typically have.

Virtualization has been setting the stage for many of these issues for years, Hoff said, but “what cloud computing is forcing us to look at is the survivability of systems…and protecting the data.”

“While I hear the perimeter is going away [as a security approach], I disagree; it’s multiplying and the diameter is contracting,” Hoff said. “You’re going to outsource responsibility, but not accountability. So we need open standards and better visibility.”
