FISMA standards will calm cloud fears, McClure believes

The federal government might not be in a mad dash to migrate its most sensitive data to the cloud, but as standards become more cemented and processes are ironed out, more agencies will move into a space perceived "as a little bit risky at the moment," according to the predictions of a General Services Administration official.

As the Federal Risk and Authorization Management Program gets rolled out and agencies get comfortable with the new standards and the rigid review process, there will be a greater tendency to move higher-risk systems and data to an outsourced model, said David McClure, associate administrator at GSA's Office of Citizen Services and Innovative Technologies.

McClure spoke with Federal Computer Week prior to participating in a panel discussion on new models of government IT at a March 21 event hosted by Cisco in downtown Washington, D.C.

In the past few years, more agencies have moved systems to the Federal Information Security Management Act's "Moderate" impact level, and federal security officers have grown increasingly comfortable with the idea of having higher levels of security on their systems.

But the higher levels, including intelligence and the classified space, demand stronger data protection, an area where the Defense Department and the intelligence community are still working to iron out the kinks and figure out how to move ahead, McClure said.

Ron Ross, a computer scientist at the National Institute of Standards and Technology, described it best, McClure said, with the analogy of a big, open suitcase to illustrate how computer security controls are used. For non-cloud systems, those standards are applied by picking the controls and classes of controls that fit whatever is being tested.

However, for cloud, “we’ve gone into that suitcase, pulled out the controls that we think are very important for vendors and government to demonstrate they have in place to protect data, access and privacy,” McClure said.

FedRAMP has created a governmentwide consensus on what those controls should be for the cloud. But each agency is interpreting that in its own way; some might require 500 controls, others only 200, he said, adding that understanding the reason for the variations, and what can be done to reach a common baseline, is an important matter.

For agency CIOs, getting comfortable with how the testing is done and making risk-based decisions about computer security remain the largest challenges with cloud computing. The FISMA-based process is a risk-based approach, "and government is pretty risk averse," which often leads to over-applying security, McClure said.

“I think it’s time to have those conversations about what works best and in what situation, and can we agree at least on a baseline,” he said. “Then give agencies prerogative based upon their unique needs and systems environments and . . . create a common approach that saves lots of money [and] lots of time and brings consistency in how security is done in government.”

About the Author

Camille Tuutti is a former FCW staff writer who covered federal oversight and the workforce.

Reader comments

Fri, Mar 23, 2012 Wyatt Starnes

The last comment is on target, IMHO. FISMA 1.0 is old and badly needs to be replaced. FedRAMP is, at best, ambiguous and incomplete. The best work continues to come out of NIST in close cooperation with other agencies and IC groups. Suggest you look at the draft update to 800-53 (version 4). This should serve as the framework for FISMA 2.0 and, in my opinion, is some of the best work I have seen on cybersecurity and IT infrastructure management. Link here:

Fri, Mar 23, 2012

Can a contractor who signs an SLA automatically adjust to a dynamic threat environment and APTs to protect sensitive information? Doubt it. If patching and encryption can't protect the information or system, they'll fall back on "it wasn't in the SLA."

Fri, Mar 23, 2012

I'm afraid the statements made here are mistaken in most respects. FISMA and FedRAMP aren't making internal government systems more secure, or secure enough. They aren't securing a single thing, precisely because agencies get to choose how controls are interpreted, what they mean and what's required. And system owners are reporting false and inaccurate FISMA security information up the chain, out of ignorance, lack of talent, fear of reprisal, or to avoid work. A main goal of FISMA is to make government managers aware of security risks. But the management comments in this article, stating that FISMA is making systems secure, show that FISMA has failed to make management aware of the risks on their systems.
