Big data: How much is enough?
Big data may be all the rage, but the federal government often struggles to handle vast volumes of information and needs better analysis to make sense of it all, according to a top General Services Administration official.
Sometimes “we think we need more data, but frankly, what we need is more analysis and better tools to conduct the analysis,” said Daniel Tangherlini, GSA’s acting administrator. He spoke at a July 17 Tech Town Hall Meeting in McLean, Va., organized by the American Council for Technology and Industry Advisory Council and TechAmerica.
"Big data" may be a trend, but Tangherlini warned against collecting more and more data for its own sake.
“In some cases, we do need more data,” Tangherlini acknowledged, but he stressed that the focus should be on reducing the data collection burden and streamlining the processes by which information is gathered, while increasing the value of the data itself.
Tangherlini, who was appointed April 2 to lead the agency in the fallout from the Western Regions spending scandal, was taking questions from the audience and responding to a comment that the government often asks the private sector for data without specifying its purpose or intent.
To aggregate spending and drive down costs, Tangherlini said GSA needs to work with industry partners to come up with innovative solutions. “What we’ve seen is a real transformation in the types of tools that are available to us,” he said.
One of the biggest challenges, he continued, is transitioning from antiquated processes that have been in use for the last two decades to the new environment of tomorrow.
“We have this huge opportunity around transformation,” Tangherlini said.
Camille Tuutti is a former FCW staff writer who covered federal oversight and the workforce.