Big data needs governance
- By Frank Konkel
- Sep 26, 2013
For federal agencies to truly make big data a big deal, data governance will be every bit as important as the technology itself.
The U.S. government has always applied governance to its records – policies on who can view them and how they are shared, stored and managed – but for most of its 237-year history, those records were paper-based and relatively manageable in size.
The amount of data the government collects in the digital age is growing exponentially thanks to technology, but without proper governance it's not going to be used anywhere close to optimally, according to John Dvorak, section chief for the Federal Bureau of Investigation's Special Technologies and Applications Division.
Unfortunately, for feds trying to harness big data, governance poses potentially more difficult challenges than either technology or strategy.
"The governance side of things is very difficult," said Dvorak, speaking at a big data conference held in Washington, D.C., on Sept. 26.
Dvorak alluded to the challenges inherent in the FBI's data-sharing efforts.
On one hand, the agency works with the intelligence community on national security matters, but it also works with law enforcement at various levels. Employees and analysts "wear different hats" daily, and the staggering amount of data available isn't always structured and tagged. The FBI, like many civilian agencies, is wary about what it shares, even with other feds.
"The hardest part for us is sharing data outside. We want to share more data with people," Dvorak said. "You have to have a governance process that as data is created, it is properly marked. How data is shared and properly marked is all brand new thinking in this world. When we hand data off to [another federal agency], we want to have a mutual agreement on how to protect that data."
Dirk Rankin, chief technology officer for the National Counterterrorism Center, said one of the problems with big data is that it's frequently described quantitatively. Techies use words like "volume" and "velocity" to describe big data, but qualitative descriptions are more beneficial for feds looking to maximize the value they get from big data, he said.
Correlations and key insights drawn from disparate sets of data within government are not emerging as fast as they need to, and better data governance could improve the return on those investments.
"My emphasis is on driving value out of the data we have and making it usable to analysts and people paid to make judgments based on that data," Rankin said.
Frank Konkel is a staff writer covering big data, mobile, open government and a range of science/technology issues. Connect with him on Twitter at @Frank_Konkel.