Big data is not just a job, it's an adventure
- By Mark Rockwell
- Dec 11, 2014
For federal agencies, making data accessible is a bit like parenting: No matter how much you do, there's always more to be done.
During a Dec. 11 AFCEA-sponsored panel discussion in Bethesda, Md., on how to use big data effectively, data officers from several large federal agencies said the amount of data they make available for outside consumption is increasing, while the complexity and the challenges presented by the data shift constantly.
"You're never done" with the job of gathering and facilitating access to agency datasets, said Daniel Morgan, chief data officer at the Transportation Department. "There's always new data."
Federal agencies are handling the tidal wave of data differently, depending on the audience.
Donna Roy, executive director of the Department of Homeland Security's Information Sharing Environment Office, said DHS is implementing four "data lakes" that will take in and store data from DHS components regardless of format.
The approach can reduce upfront costs and make data more widely available within the organization or to outside stakeholders, depending on the data's sensitivity. Roy said the approach could reduce the current heavy "janitorial" workload to clean up data and make it useful across the organization.
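The idea behind a data lake of this kind is "schema-on-read": raw data is stored as-is at ingest time, and the cleanup and parsing work is deferred until someone actually consumes it. The sketch below illustrates that pattern in miniature; the class and field names are illustrative only and do not reflect any actual DHS system.

```python
import csv
import io
import json

# Minimal "data lake" sketch: store raw payloads as-is, tagging each with
# its source and format, and defer parsing until the data is read.
class DataLake:
    def __init__(self):
        self._store = []  # raw records, kept in their original format

    def ingest(self, source, fmt, payload):
        # Accept any format at ingest time; no cleanup or schema enforcement.
        self._store.append({"source": source, "format": fmt, "raw": payload})

    def read(self, source):
        # Parsing (the "janitorial" work) happens only when a consumer asks.
        for rec in self._store:
            if rec["source"] != source:
                continue
            if rec["format"] == "json":
                yield json.loads(rec["raw"])
            elif rec["format"] == "csv":
                yield from csv.DictReader(io.StringIO(rec["raw"]))
            else:
                yield {"raw": rec["raw"]}  # unknown formats pass through

lake = DataLake()
lake.ingest("component-a", "json", '{"id": 1, "status": "ok"}')
lake.ingest("component-b", "csv", "id,status\n2,ok\n")
records = list(lake.read("component-a")) + list(lake.read("component-b"))
```

The tradeoff, as the panelists noted, is that deferring cleanup lowers upfront cost but shifts the burden to consumers, which is why governance and metadata matter in such architectures.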
Morgan said hundreds of state and local agencies submit data to DOT daily from myriad sources, such as crash sites and road sensors, for both public and internal use. The department created an interagency dashboard that shows how well states are performing in their data submissions.
The advent of autonomous vehicles will complicate DOT's job, he added, and deciding how to ingest and support all the data generated by those vehicles is a looming challenge for federal policymakers.
Roy said the National Information Exchange Model, the Homeland Security Information Network, and Identity, Credential and Access Management are enabling major information integration across more than 20,000 federal, state and local law enforcement agencies, which means DHS is somewhat ahead of the game in making large datasets work in a variety of environments.
The department's newest program, the DHS Data Framework, will also help, she said. The framework is a scalable IT program with built-in capabilities to support advanced data architecture and governance processes. It has built-in privacy protections and enables a more structured use of existing homeland security-related information across the organization.
Mark Rockwell is a senior staff writer at FCW, whose beat focuses on acquisition, the Department of Homeland Security and the Department of Energy.
Before joining FCW, Rockwell was Washington correspondent for Government Security News, where he covered all aspects of homeland security from IT to detection dogs and border security. Over the last 25 years in Washington as a reporter, editor and correspondent, he has covered an increasingly wide array of high-tech issues for publications like Communications Week, Internet Week, Fiber Optics News, tele.com magazine and Wireless Week.
Rockwell received a Jesse H. Neal Award for his work covering telecommunications issues, and is a graduate of James Madison University.
Contact him at email@example.com or follow him on Twitter at @MRockwell4.