Security researchers seek clarity on legal protections in CISA vulnerability disclosure directive
- By Derek B. Johnson
- Dec 18, 2019
The public still has another 10 days to comment on a draft order crafted by the Cybersecurity and Infrastructure Security Agency directing federal civilian agencies to create their own vulnerability disclosure programs for internet-accessible systems and services.
The directive tasks each agency with developing and publishing its own vulnerability disclosure policy, enabling receipt of unsolicited vulnerability reports, maintaining supporting handling procedures for any vulnerability reports received and reporting certain metrics to CISA.
CISA put out a reminder in the form of a Federal Register notice set to publish Dec. 19 that it is still seeking comment on the draft order, but the agency has already received feedback from some notable names in the vulnerability research community.
Of the comments submitted on GitHub thus far, most indicate general support for the idea, but many also outline a number of suggested improvements and clarifications. Others question whether such programs are the best way to identify and mitigate the government's security vulnerabilities.
Amelie Koran, a senior technology advocate at Splunk, called for a reevaluation of CISA's directive and the idea of outsourcing the work of finding government vulnerabilities to private third parties. Instead, Koran suggested that inspectors general are best positioned to carry out the work.
The Inspector General Empowerment Act "provides leverage and doesn't require more or other legislation or rule-making to enforce compliance outside of current authorities," Koran wrote. "IGs also typically have the knowledge and process in place over the many years of operations to know how to work within agency operations, frameworks and missions to accomplish what is suggested in this [directive]."
The most common issue raised revolves around how researchers will be legally protected when probing government systems and when they can take their findings to the public.
White hat hacker and noted bug bounty hunter Jack Cable said agencies should adopt CISA's language for granting legal safe harbor to authorized researchers and develop a process for what happens when agencies don't fix their bugs.
CISA and the Office of Management and Budget "should plan for the case that agencies fail to adequately remediate reported vulnerabilities," Cable wrote. "This may be the first time that an agency's response capabilities are tested against real vulnerability reports, and not every agency will be successful in patching their systems effectively and timely.… CISA and other policy leadership should plan for when this happens and aid agencies to improve their cybersecurity processes and talent."
Companies often complain that publicly disclosing a vulnerability before it's been fixed can invite attacks, while researchers often counter that many organizations will drag their feet or sit on news of an internal vulnerability for public relations or other reasons, leaving their customers exposed. The industry has largely coalesced around an informal policy whereby researchers first inform the entity privately and give it 90 days to develop a patch before the vulnerabilities are made public.
The draft directive does say that agencies should recognize that "a reporter or anyone in possession of vulnerability information can disclose or publish the information at any time" and directs them to craft policies that include "a commitment to not recommend or pursue legal action" for "good faith" security research activities.
A footnote recommends that agencies remediate within 90 days of being notified, but HackerOne founder and CTO Alex Rice requested it be elevated to the body of the main directive, "else I'd be concerned that they are easily missed or ignored."
Adam Bernstein, an IT management specialist at the Office of the Inspector General for the Department of Housing and Urban Development, said that many federal agencies rely on legacy systems with known vulnerabilities that can't be fixed due to funding or resource constraints. Such vulnerabilities, he argued, should not be covered under a disclosure program.
"The agencies continue to operate the systems because they accept the security risk in order to not impede the agency mission. If the systems become non-operational during an attack, then the assumption is that appropriations will then be provided to mitigate the issue," Bernstein wrote. "These legacy and underfunded systems should never be a part of any vulnerability disclosure program because the discovery of more vulnerabilities without the ability for remediation will only further weaken the country's IT systems."
Allen Householder, a threat and vulnerability researcher at the Software Engineering Institute at Carnegie Mellon University, responded to Bernstein's comment by noting that "the vulnerabilities are already there and attackers can find them and won't tell you about them."
Derek B. Johnson is a senior staff writer at FCW, covering governmentwide IT policy, cybersecurity and a range of other federal technology issues.
Prior to joining FCW, Johnson was a freelance technology journalist. His work has appeared in The Washington Post, GoodCall News, Foreign Policy Journal, Washington Technology, Elevation DC, Connection Newspapers and The Maryland Gazette.
Johnson has a bachelor's degree in journalism from Hofstra University and a master's degree in public policy from George Mason University. He can be contacted at [email protected], or follow him on Twitter @derekdoestech.