Big data needs governance

For federal agencies to truly make big data a big deal, data governance will be every bit as important as the technology itself.

The U.S. government has always applied governance to its records – policies on who can view them and how they are shared, stored and managed – but for most of its 237-year history, those records were paper-based and relatively manageable in size.

The amount of data the government collects in the digital age is growing exponentially thanks to technology, but without proper governance it's not going to be used anywhere close to optimally, according to John Dvorak, section chief for the Federal Bureau of Investigation's Special Technologies and Applications Division.

Unfortunately, for feds trying to harness big data, governance poses potentially more difficult challenges than either technology or strategy.

"The governance side of things is very difficult," said Dvorak, speaking at a big data conference held in Washington, D.C., on Sept. 26.

Dvorak alluded to the challenges inherent in the FBI's data-sharing efforts.

On one hand, the agency works with the intelligence community on national security matters, but it also works with law enforcement on various levels. Employees and analysts "wear different hats" daily, and the staggering amount of data available isn't always structured and tagged. The FBI, just like many civilian agencies, is wary about what it shares, even with other feds.

"The hardest part for us is sharing data outside. We want to share more data with people," Dvorak said. "You have to have a governance process that as data is created, it is properly marked. How data is shared and properly marked is all brand new thinking in this world. When we hand data off to [another federal agency], we want to have a mutual agreement on how to protect that data."

Dirk Rankin, chief technology officer for the National Counterterrorism Center, said one of the problems with big data is that it's frequently described quantitatively. Techies use words like "volume" and "velocity" to describe big data, but qualitative descriptions are more beneficial for feds looking to maximize the value they get from big data, he said.

Correlations and key insights drawn from disparate data sets across government are not emerging as quickly as they need to, and better data governance could improve the return on those investments.

"My emphasis is on driving value out of the data we have and making it usable to analysts and people paid to make judgments based on that data," Rankin said.

About the Author

Frank Konkel is a former staff writer for FCW.
