Can the U.S. stop malware and buy it at the same time?
- By Derek B. Johnson
- Aug 09, 2017
The federal government is the world's largest purchaser of malware. But the recent arrest of British hacker and information security researcher Marcus Hutchins by U.S. federal authorities reignited a long-simmering debate about how the government regulates malware, exploits and other software vulnerabilities.
The National Security Agency, the FBI and elements inside the Department of Homeland Security are responsible for discovering flaws in existing software, alerting private companies and the public to those vulnerabilities, and prosecuting malicious actors who create and sell exploits on the black and gray markets.
On the other hand, the U.S. government itself is, by most accounts, the largest developer and buyer of malware, exploits and software vulnerabilities in the world today. Some of these vulnerabilities are disclosed to the public and the tech industry to patch and fix; some are used for detection in cyber-protection systems such as DHS's Einstein; others are kept secret by intelligence agencies for surveillance and cyber espionage.
The tensions created by these dual missions have raised questions in the tech industry and among policymakers and independent watchdogs about how the government balances these competing and sometimes contradictory priorities.
Curtis Dukes, former director of the NSA's information assurance directorate and current executive vice president at the Center for Internet Security, told FCW that the intelligence community must weigh the value of a particular exploit against the harm it could cause to citizens and businesses who are left vulnerable.
"We try to balance the risk to the nation and to our allies with the ability to prosecute our intelligence-gathering mission," said Dukes. "It's our job to make that decision as best we can."
It's unclear just how much the government spends purchasing or developing such malware tools, since much of this spending is classified. A report by the Washington Post found that the NSA alone spent upwards of $25 million on malware in 2013. One contractor speaking to FCW on background estimated that the government spends somewhere north of $100 million per year on such products. Almost no one disputes the claim that the U.S. government is the biggest player in the malware market.
There are some substantive differences between the software developed and purchased by the government and the kind of malware that Hutchins is accused of selling. According to Johannes Ullrich, dean of research at SANS, many of the malware tools used by the government are developed using in-house resources. When the government lacks sufficient expertise to make the tools itself, it will often buy them through existing IT contracts.
Rarely if ever will the government purchase these tools directly on the black market, though it is possible that third-party contractors who sell such tools to the government might.
To Ullrich, this process, along with the bureaucratic nature of the government, is sufficient to sidestep any potential entanglement between its intelligence and criminal justice missions regarding malware.
"I don't really think the government is benefiting that much from the illegal malware-for-sale market, he said. "With the FBI, for example, investigating the criminal part of it while NSA is doing the offensive part, there's enough compartmentalization there to avoid a conflict."
Distinctions are also made around intent. The malware allegedly sold by Hutchins was a banking Trojan designed to collect and exploit banking information. According to a cyber contractor who tests malware for the government and develops and tests security tools, the kind of profit-driven malware often peddled on the black and gray markets typically isn't useful for the kind of offensive cyber operations carried out by national security and surveillance agencies.
"I don't think it's inconsistent for [the government] to be prosecuting criminals. That being said, there are tensions between [different] groups that are finding vulnerabilities to target terrorists and those who would rather use it to secure those systems," said Jason Syverson, CEO of Siege Technologies.
However, Syverson is uncomfortable with the prosecution of a developer like Hutchins based on how a third party chose to use his product.
"I do have a concern as someone from the community, if he just wrote code that was used maliciously and the government is saying he should be prosecuted, when his code probably doesn't look that differently from the code we use," Syverson said. "Because it's like someone developing a car and advertising that you could use it run someone over. But it's still a car."
The private sector also plays a role, instituting "bug bounties" that pay information security researchers who discover software vulnerabilities. However, experts believe the black market continues to be far more lucrative, and companies including Google have shut down their bounty programs due to a lack of interest.
The government is increasingly getting into the bug bounty game, but awards from such programs pale in comparison to what is available on the gray market.
Laura Gerhardt, a developer at the Technology Transformation Service, told FCW on the sidelines of an industry event that the government should pay researchers and white hat hackers in accordance with the importance of the vulnerabilities they discover.
"The notion behind bug bounties is triaged tiers of vulnerability. So there are three tiers commensurate with the bug bounty program in a way that incentivizes researchers. If it's just one line of code, they shouldn't be paid out the nose for it, but as the vulnerability increases, so should the compensation," she said.
To Ullrich, once the intelligence community loses exclusive access to knowledge of a particular bug or software exploit, it should immediately shift to an information assurance posture.
"In my opinion, [the government does] have a duty to disclose this to a manufacturer once they see these exploits being used by third parties," he said.
Hutchins, who lives in the United Kingdom and works for Los Angeles-based security firm Kryptos Logic, discovered a secret kill switch in the code of the WannaCry ransomware that helped slow the spread of the destructive attack. The WannaCry attack is thought to have been derived from exploits stolen from the National Security Agency by the hacker group Shadow Brokers.
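The mechanism Hutchins found can be illustrated with a minimal sketch: the malware checks a hard-coded domain before spreading, so simply registering that domain (as Hutchins did) flips the switch off worldwide. The domain name and function names below are placeholders for illustration, not WannaCry's actual code, which performed an HTTP request rather than a bare DNS lookup.

```python
import socket

# Placeholder domain -- the real kill-switch domain was a long,
# hard-coded gibberish hostname inside the WannaCry binary.
KILL_SWITCH_DOMAIN = "example-kill-switch-domain.invalid"

def kill_switch_active(domain: str = KILL_SWITCH_DOMAIN) -> bool:
    """Return True if the domain resolves.

    While the domain was unregistered, the lookup failed and the
    malware continued spreading; once registered, the check succeeded
    and the malware exited -- the behavior Hutchins triggered.
    """
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

if __name__ == "__main__":
    if kill_switch_active():
        print("kill switch live: halt")
    else:
        print("kill switch dark: malware would keep spreading")
```

Because the check is a plain, unencrypted lookup of one fixed domain, a single researcher registering that domain could neutralize infections globally, which is why the discovery was so effective at slowing the outbreak.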
Rick Ledgett, who served as deputy director of the NSA from 2014 to 2017, denied that the government hoards exploits and vulnerabilities in a post on the legal blog Lawfare.
"Most of the vulnerabilities discovered by the U.S. government are disclosed, and at the National Security Agency the percentage of vulnerabilities disclosed to relevant companies has historically been over 90 percent," Ledgett wrote.
Dukes said the U.S. relies at least in part on having exclusive knowledge of the technical vulnerabilities present in software systems to conduct surveillance.
"You have to understand that access is always going to be at the heart of the intelligence community's business," said Dukes. "They can't be successful if they can't get access to intel that decision-makers in this country need, so that's always going to play front and center [in their decision making]."
Dukes declined to comment specifically on WannaCry or other incidents, but he acknowledged that the government and private sector could do more to increase accountability.
"Governments that retain vulnerabilities for national security purposes have a responsibility to adequately safeguard them," he said.
This story was updated Aug. 24 to correct Curtis Dukes' job title.