Data exchange and cybersecurity
- By Derek B. Johnson
- Feb 05, 2020
The federal government plays a big role setting standards and controls around technology interoperability. The National Institute of Standards and Technology sets out mandatory guidance for federal agencies, and that guidance often becomes best practice for private industry. Other groups, like MITRE and the Open Cybersecurity Alliance, also develop frameworks and open source coding tools to facilitate better data interchange between software systems.
Kent Landfield, chief standards and tech policy strategist at McAfee, said the frameworks provided by NIST and others are only useful in an environment where private vendors don't see interoperability with other products through the prism of competition. Too many companies opt to create their own siloed software ecosystems, refusing to consider how the plumbing in their products might operate within a more diverse information environment. Instead of competing to provide the best customer service or user experience, many are opting to make it harder for their products to communicate with systems designed by competitors.
"Over the years, vendors got into this [mindset of] 'my secret sauce is better than your secret sauce and so I'm not going to cooperate with you,'" Landfield said at a Feb. 4 Center for Strategic and International Studies event.
Culture and structure might play an even more crucial role than technology when it comes to successful interoperability. Michael Daniel, former White House cyber coordinator and current president of the nonprofit Cyber Threat Alliance, said the technological infrastructure CTA uses to share threat information among members was set up relatively easily in a few months with "a couple hundred thousand dollars." The business rules, bylaws and other guidance needed to run the operation took "two years and several million dollars of lawyering."
Even small variances among different actors can throw a system off. The alliance uses the same threat-reporting language as the Department of Homeland Security: Structured Threat Information eXpression, or STIX. While STIX is supposed to provide a common terminology for sharing threat data, Daniel recounted one instance where reporting indicated a new strain of malware was shooting up the charts. Upon closer inspection, it turned out one organization was simply tagging every new malware variant it discovered under the same default name: malware.generic.
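The failure mode Daniel describes can be sketched in a few lines. The snippet below builds simplified, hand-written STIX 2.1-style "malware" objects (plain dictionaries with hypothetical IDs and sample names, not output from any real feed or the `stix2` library) and shows how aggregating by name makes a placeholder tag look like the most prevalent strain:

```python
from collections import Counter

# Hypothetical, simplified STIX 2.1-style malware objects.
# Real STIX ids are "malware--<UUID>"; placeholders are used here.
reports = [
    {"type": "malware", "spec_version": "2.1",
     "id": "malware--0001", "name": "Emotet", "is_family": True},
    {"type": "malware", "spec_version": "2.1",
     "id": "malware--0002", "name": "malware.generic", "is_family": False},
    {"type": "malware", "spec_version": "2.1",
     "id": "malware--0003", "name": "malware.generic", "is_family": False},
]

# Counting by name, the default tag dominates the "chart" even though
# each generically tagged object may be a distinct, unrelated sample.
counts = Counter(obj["name"] for obj in reports)
print(counts.most_common(1))  # [('malware.generic', 2)]
```

The point is that a shared schema only guarantees that fields parse the same way; if one participant fills a field with a default value, every consumer's analytics inherit the distortion.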
"The hard part was not the technology, the hard part was the soft stuff and the culture to go along with it," Daniel said.
Last year NIST released an interoperability framework to guide software developers as they build tools to process and analyze large amounts of data. The guidance was developed in partnership with more than 800 experts from government, industry and academia and can be applied to a wide variety of data science projects, from physics experiments and telescopes to processing and analyzing sensor data culled from internet-of-things devices.
Donna Dodson, NIST's chief cybersecurity advisor, said her organization learned as it studied the issue that how you communicate to different audiences and industries matters. For cybersecurity professionals, the acronym "PAC" immediately calls to mind "physical access control." Health care organizations, however, often use it to refer to picture archive systems. Dodson called it an example of how interoperability is about more than crafting solid technical guidance.
"I have to have that policy interoperability and understanding with language first and foremost before I can have a good dialog about how we would … help protect radiological kinds of environments in health care," she said. "I can't expect them to come to my world."
Even though the current software market appears hopelessly fragmented, Daniel expressed optimism about the future, saying the internet and IT ecosystem will only get better over time.
"Cyberspace as we think of it is probably 25-30 years old; in policy and legal terms that's like nothing," Daniel said. "We've had hundreds of years to develop policy in other areas and we're still struggling with it, so it really shouldn't be surprising that's the case here."
Derek B. Johnson is a senior staff writer at FCW, covering governmentwide IT policy, cybersecurity and a range of other federal technology issues.
Prior to joining FCW, Johnson was a freelance technology journalist. His work has appeared in The Washington Post, GoodCall News, Foreign Policy Journal, Washington Technology, Elevation DC, Connection Newspapers and The Maryland Gazette.
Johnson has a Bachelor's degree in journalism from Hofstra University and a Master's degree in public policy from George Mason University. He can be contacted at [email protected], or follow him on Twitter @derekdoestech.