From ELC: Risk management takes center-stage
- By Wyatt Kash
- Oct 27, 2008
Agencies seeking to secure federal information systems need to focus on attack-based metrics as one of several approaches to reduce vulnerabilities and better manage risks, according to a group of government and industry officials.
The escalating scale of threats against agencies’ IT systems continues to put new pressure on government to find better ways to manage risks. But of the three core factors commonly used in risk assessments—threat, vulnerability and impact—the only factor that can really be managed and reduced is vulnerability, said Alan Paller, director of research at the SANS Institute.
Paller, speaking at a panel on managing risks at the ACT-IAC Executive Leadership Conference today in Williamsburg, Va., said that current government guidance on assessing IT system security leaves too much room for interpretation. That in turn breeds uncertainty.
“Uncertainty causes wasteful wars between inspectors general and chief information officers,” Paller said.
The better approach, according to Paller, would borrow measures now in use by the banking industry that deal with many of the same IT security concerns. Specifically, Paller recommended that agencies:
- Engineer to block “known bads.”
- Use procurement to buy security “baked in.”
- Continually monitor and fix vulnerabilities to known attack vectors.
- Monitor daily for new attacks against critical vulnerabilities.
- Innovate to block new attacks.
- Automate to make old defenses inexpensive and channel spending toward the right things.
- Ensure business continuity and effective incident response.
Central to executing these steps, however, is the need for agencies to embrace the notion that “defense must be informed by offense,” with IT security teams staffed by people who have lived through cyberattacks, Paller added. And, he said, those teams need to focus on “attack-based metrics” in prioritizing risk response measures.
Agencies have a long way to go to get to that point, he said, citing one recent estimate that at a typical agency, “70 percent of the (IT security) staff has soft skills; only 30 percent have specialized security skills,” and stressing the urgency of reversing that ratio.
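The attack-based metrics Paller describes can be illustrated with a minimal sketch: score each known vulnerability by how often its attack vector is actually being observed, weighted by impact, so that remediation priorities are informed by offense. The data, field names and scoring formula below are hypothetical, not a method Paller specified.

```python
# Hypothetical sketch of attack-based prioritization: rank vulnerabilities
# by (observed attacks on that vector) * (impact), descending, so defense
# is driven by what attackers are actually doing.

def prioritize(vulns, attack_counts):
    """Rank vulnerabilities by observed-attack frequency times impact."""
    def score(v):
        return attack_counts.get(v["vector"], 0) * v["impact"]
    return sorted(vulns, key=score, reverse=True)

# Illustrative data only.
vulns = [
    {"id": "V-1", "vector": "sql_injection", "impact": 5},
    {"id": "V-2", "vector": "weak_password", "impact": 3},
    {"id": "V-3", "vector": "open_port",     "impact": 2},
]
attack_counts = {"sql_injection": 40, "weak_password": 120, "open_port": 5}

ranked = prioritize(vulns, attack_counts)
print([v["id"] for v in ranked])  # → ['V-2', 'V-1', 'V-3']
```

Note that a vector with lower nominal impact (weak passwords) can outrank a higher-impact one when it is under far heavier attack, which is the point of letting offense inform defense.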
Paller was followed by a group of panelists, speaking on a non-attribution basis, that included Sallyanne Harper, chief financial officer and chief administrative officer of the Government Accountability Office; Cathleen Berrick, director, Homeland Security and Justice, Government Accountability Office; Karen Evans, administrator, Office of E-Government and Information Technology, Office of Management and Budget; Gregory H. Friedman, inspector general, Energy Department; and Erik Hopkins, professional staff, Senate Homeland Security and Governmental Affairs Committee.
Among other measures they recommended, or said are still needed, to improve risk management efforts at agencies:
- Complete the transition to having government desktop computers fully compliant with Federal Desktop Core Configuration standards. “Offense has to know what the environment is to work on the defense,” said one panelist.
- Tap the expertise of GAO analysts and IGs who have had the opportunity to see and share best practices in risk management.
- Further the discipline of risk management through improved training and exposure to commercial world risk management experts.
- Involve the people actively engaged in cybersecurity in higher decision-making circles.
- Be careful not to become a risk-averse government in the process of mitigating risks.
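The Federal Desktop Core Configuration point above amounts to verifying that each desktop matches a mandated baseline configuration. A minimal sketch of that kind of check follows; the setting names and baseline values are hypothetical, not actual FDCC settings.

```python
# Hypothetical sketch of a baseline-compliance check (FDCC-style):
# compare a machine's actual settings against a mandated baseline and
# report any deviations. Settings and values are illustrative only.

BASELINE = {
    "password_min_length": 12,
    "screen_lock_minutes": 15,
    "firewall_enabled": True,
}

def compliance_gaps(actual):
    """Return each baseline setting the machine deviates from."""
    return {k: {"expected": v, "actual": actual.get(k)}
            for k, v in BASELINE.items()
            if actual.get(k) != v}

desktop = {"password_min_length": 8,
           "screen_lock_minutes": 15,
           "firewall_enabled": True}

print(compliance_gaps(desktop))
# → {'password_min_length': {'expected': 12, 'actual': 8}}
```

Knowing every desktop conforms to one baseline is what lets defenders reason about the environment—the panelist's point that “offense has to know what the environment is to work on the defense.”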
Wyatt Kash served as chief editor of GCN (October 2004 to August 2010) and also of Defense Systems (January 2009 to August 2010). He currently serves as Content Director and Editor at Large of 1105 Media.