Can big data save the government $500 billion?
- By Frank Konkel
- Jun 17, 2013
A survey of 150 federal IT executives conducted by Meritalk suggests big data has the potential right now to produce a smarter, more efficient government that could cut the $3.54 trillion federal budget by 14 percent, freeing up an extra $500 billion per year.
The survey findings are documented in a report called "The Smarter Uncle Sam: The Big Data Forecast." The eye-catching $500 billion figure comes from extrapolating what the 150 IT execs believe their respective agencies can save by successfully leveraging big data – a relatively new tech term meant to describe pulling insights from the analysis of large, sometimes seemingly unrelated data sets.
The report suggests a smorgasbord of savings in three main arenas:
- Managing the transportation infrastructure;
- Fighting fraud, waste and abuse;
- Executing military, intelligence, surveillance and reconnaissance missions.
"When they look at how they are using data and how it could be used in all these kinds of fields, there are a significant amount of dollars that can be saved there," said Rich Campbell, Chief Technologist at EMC Federal. EMC Federal underwrote the Meritalk report.
"The data analysis of trends and allocation of resources is at a much more molecular level now than ever before, and the potential is becoming more realized on a day-by-day basis," Campbell said. "We're seeing more use cases and newer technologies, and the government is moving ahead and tackling the low-hanging fruit out there right now."
Clear-cut use cases for big data are not yet common in the federal government, though some agencies have been doing big data since before it carried that label. The National Oceanic and Atmospheric Administration, for example, frequently analyzes large data sets with supercomputers to predict the weather. In the intelligence community, the National Security Agency's recently leaked collection of Internet and phone records is another clear-cut big data use case.
The Meritalk survey states that about one-quarter of the federal IT executives interviewed had launched at least one big data initiative, which is promising given the relative newness of the term.
Big data today is akin to what cloud computing was to feds in 2009, and it still lacks a clear-cut definition. That one-quarter of agencies have some kind of big data initiative – even if it isn't a full-fledged project or pilot – means the government has done more than take notice of big data.
"In the last two to three years, we've gone through this whole transformation effect – it's morphed into a commodity that an organization can leverage from places people didn't anticipate before," Campbell said. "Agencies are using their existing resources, in some cases, to spin up Hadoop clusters and allowing applications to tie in. The landscape is really beginning to change."
About 70 percent of the federal IT executives surveyed said they believed that five years from now, big data will be critical to fulfilling federal mission objectives. But how do agencies get up to speed on big data in such a short time? Even now, many agencies have not tapped cloud computing's full potential, and it emerged considerably earlier than big data.
Much as infrastructure-as-a-service and storage-as-a-service models have taken off in both the private and public sectors as organizations look to save money, Campbell said data analytics must be thought of in the same vein before agencies really put it to heavy use. Before the government gets there entirely, though, old IT methods – like large, long-term contracts with single contractors for proprietary solutions – are going to have to go. Agencies are also going to have to spend considerable time deciding how they want to use big data, and specifically which questions they want their data to answer.
"The cost model of these services is going to have to change," Campbell said. "I don't foresee massive long-term contracts anymore – I see agencies looking more for cheaper ways to do data analytics. The cost is already coming down, and the processes and people are only going to get more refined."
Frank Konkel is a former staff writer for FCW.