Why big data needs a soul
- By Mark Rockwell
- Apr 22, 2016
"Data is a story with soul," said Dr. Kristen Honey, a policy advisor in the White House's Office of Science and Technology Policy.
Honey, who appeared with a number of other experts on an April 22 AFFIRM panel on taming the explosion of government data, was quoting author and storyteller Brené Brown. But she and other panelists said much the same thing, stressing that the most effective tool for dealing with the oceans of data generated by federal agencies isn't technological, but human.
Managing data, she said, "is a skill set." That set not only includes technically adept personnel, but also those who can communicate with others effectively and network with users to work out what they're searching for in the petabytes of data federal agencies are unleashing. "Don't exclude the softer human skills" in forging teams to handle data operations, she said.
The ability to make sense of datasets, relate them to other valuable datasets and see their combined potential, International Trade Commission CIO Kirit Amin said, is the key to linking and squeezing value out of big data for agencies and the public.
"It's an ecosystem" of technical personnel who can use data administration tools, as well as employees who can see larger connections, Amin said. He suggested that if Health and Human Services data on health care fraud could be associated with the Internal Revenue Service's master death list, for instance, not only could health care fraud schemes be stopped, but the effort could potentially uncover trends that would prevent future fraud.
Agencies must be able to see where their services might be of the most benefit to potential consumers and users, said David McClure, lead analyst for open government data services at the National Oceanic and Atmospheric Administration. NOAA, he said, is coming up on the one-year anniversary of its cooperative research and development agreement with commercial cloud providers to explore ways to make its deep ocean of weather data available through their facilities.
"The technology is not a limiting factor," McClure said. The issue is how to make NOAA's data available in ways that private industry can use to build new services for the public.
Dr. Peter Aiken, an associate professor of information systems at Virginia Commonwealth University, said that, fortunately, the federal government is better at managing big data than the private sector. "We have federal laws, like Clinger-Cohen, that tell us we have to manage data. There is no private sector correlation."
Mark Rockwell is a senior staff writer at FCW, whose beat focuses on acquisition, the Department of Homeland Security and the Department of Energy.
Before joining FCW, Rockwell was Washington correspondent for Government Security News, where he covered all aspects of homeland security from IT to detection dogs and border security. Over the last 25 years in Washington as a reporter, editor and correspondent, he has covered an increasingly wide array of high-tech issues for publications like Communications Week, Internet Week, Fiber Optics News, tele.com magazine and Wireless Week.
Rockwell received a Jesse H. Neal Award for his work covering telecommunications issues, and is a graduate of James Madison University.
Contact him at firstname.lastname@example.org or follow him on Twitter at @MRockwell4.