Big data in 2013: Not just for big projects

Rather than take on big data for big data’s sake, agencies should be strategic in their approach, writes Bill Cull.

Bill Cull

No organization in the world has more data than the U.S. government. From real-time weather readings to Common Access Card swipes at federal buildings, the government has vast amounts of data at its disposal that can be collected, sorted and analyzed for purposes ranging from economic development to cybersecurity.

The presidential memorandum “Building a 21st Century Digital Government,” released in May 2012, calls for a larger movement to “unlock the power of government data to spur innovation across our nation and improve the quality of services for the American people.” However, with tightening budgets and an uncertain fiscal future, agencies might be reluctant to hop on the big-data bandwagon, and understandably so.

With the start of a new year, several agencies will face increasing pressure to incorporate big data into their larger IT strategies. Investing in big data can sound like a tall order, but the trick is to dip your toe in the water before jumping in headfirst.

There is no specific formula for tackling big-data challenges in government. Each agency is unique and has its own set of challenges to solve. Rather than take on big data for big data’s sake, agencies should be strategic in their approach.

As the TechAmerica Foundation Big Data Commission points out in its recent report “Demystifying Big Data: A Practical Guide to Transforming the Business of Government,” agencies should identify a few key business or mission requirements that big data can address and understand the “art of the possible” before making any substantial investments. There are often opportunities to deploy small-scale big-data projects first and then build toward larger investments on the success of those pilot programs.

Agencies should ask themselves a few key questions as part of the cost/benefit analysis process: Will this big-data project help achieve one of our critical business objectives? Do we have the resources and capabilities needed to take on a project of this scale? Do we have access to the datasets necessary to make the most of this project? Can it be completed in a timely manner?

Although big-data projects can eventually pay for themselves by saving time and resources, federal, state and local CIOs should also consider technologies that reap benefits beyond their agencies’ bottom line.

For example, one of the most significant challenges the National Security Agency faces is providing vast amounts of real-time data gathered from intelligence agencies, military branches and other sources to authorized users with different access privileges.

Since July 2009, NSA has used the open-source Apache Hadoop platform to power a massive, nationwide system for sharing and analyzing data. Hadoop allows agencies to tap huge, distributed data sources and has paved the way for the adoption of more advanced big-data solutions.
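
To make that concrete, here is a minimal sketch of the kind of MapReduce job Hadoop runs across a distributed dataset: it counts log records per originating source. The tab-delimited input layout and the EventCount class and field positions are illustrative assumptions, not a description of NSA’s actual system.

    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class EventCount {
        // Mapper: emits (source, 1) for each tab-delimited log record.
        public static class SourceMapper
                extends Mapper<LongWritable, Text, Text, LongWritable> {
            private static final LongWritable ONE = new LongWritable(1);
            private final Text source = new Text();
            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                if (fields.length > 0) {
                    source.set(fields[0]); // first field assumed to name the data source
                    context.write(source, ONE);
                }
            }
        }

        // Reducer: sums the per-source counts emitted by the mappers.
        public static class SumReducer
                extends Reducer<Text, LongWritable, Text, LongWritable> {
            @Override
            protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                    throws IOException, InterruptedException {
                long sum = 0;
                for (LongWritable v : values) sum += v.get();
                context.write(key, new LongWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "event count");
            job.setJarByClass(EventCount.class);
            job.setMapperClass(SourceMapper.class);
            job.setCombinerClass(SumReducer.class); // pre-aggregate on each node
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(LongWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Because the same job runs unchanged on one test node or a thousand machines, a pilot like this is a realistic on-ramp to larger deployments.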

NSA and other agencies, including the CIA, are now turning to data warehousing, mining and visualization tools that integrate with Hadoop. Those solutions manage and analyze data for the entire Hadoop environment. By integrating Hadoop with other big-data technologies, agencies are better equipped to make critical decisions in real time.
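
Apache Hive is one example of such a warehousing layer: it exposes data stored in Hadoop through SQL-style queries that compile down to MapReduce jobs. The sketch below, with a hypothetical host, credentials and table name, shows how an analyst’s tool might query it through the standard HiveServer2 JDBC interface.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
        public static void main(String[] args) throws Exception {
            // Load the HiveServer2 JDBC driver; host, port, user and table
            // names below are placeholders for illustration.
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://warehouse-host:10000/default", "analyst", "");
                 Statement stmt = conn.createStatement()) {
                // An aggregate query that Hive translates into MapReduce jobs
                // over data already stored in the Hadoop cluster.
                ResultSet rs = stmt.executeQuery(
                    "SELECT source, COUNT(*) AS events " +
                    "FROM access_log GROUP BY source ORDER BY events DESC");
                while (rs.next()) {
                    System.out.println(rs.getString("source") + "\t" + rs.getLong("events"));
                }
            }
        }
    }

The point is less the specific tool than the pattern: the analysis layers sit on top of the same distributed store rather than copying data out of it.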

Big data is not a fad technology. It’s transformational. Instead of waiting to see how big data settles into the federal IT community, agencies should embark on small big-data projects to test the technology for themselves.

As the TechAmerica commission notes in its report, agencies should not think of big data as an IT solution that solves reporting and analytical problems but rather as a strategic asset that can help achieve missions. There are real efficiencies to be gained as budgets continue to shrink, and investing in big-data projects gives agencies another avenue for executing their strategic goals.