By Steve Kelman


Big data, big interest

Josh Helms is an Army officer currently doing a one-year stint at the IBM Center for the Business of Government. The assignment is through the Army's Training with Industry program, and Helms is working on learning how industry is using "big data" and how the government can best use it. 

I recently retweeted a link from the IBM Center's John Kamensky (whom I first got to know during the Clinton administration, when he was a GAO civil servant on detail to the "reinventing government" program) about the blogs Helms has been posting on what he's been learning. Interestingly, my retweet was itself retweeted and favorited considerably more often than my typical Twitter post. Clearly, this is a topic that interests lots of folks in and around the government.

Helms has written four blog posts so far -- the most recent on actual examples of how government is using big data -- and promises more to come.

The first message I got from the blogs was that -- have you heard this one before? -- it's not about the technology, stupid. It's about using IT to create value for an organization and its customers.

To be sure, there are a bunch of technologies that have made it possible for dispersed, and even unstructured, datasets to talk with each other in a way that makes analysis more possible. And Helms notes that organizations need to deal with data governance issues across datasets to allow analysis.

But he quotes an IBM exec as noting that "just having massive amounts of data is not sufficient -- in fact, without the proper analytic tools, it can lead to information overload." Indeed, the defining feature of big data, Helms argues, may not even be its bigness, but rather an organization's ability to use these new analytic tools even for smaller datasets.

As I read through the examples in Helms' latest post, the three words that come up most often are "interoperability," "unstructured," and "analytics." The first two are in significant measure the domain of techies; the last is the domain of subject matter experts. In one of his examples, the National Center for Atmospheric Research has integrated data from utility companies, universities and government to improve predictions of supply and demand for power, while analysis of weather and atmospheric patterns enables more reliable and efficient production of renewable energy. The Social Security Administration is analyzing highly unstructured data from disability applications and medical treatment regimens, in the context of expected treatment courses, to identify fraudulent claims. And DHS is developing more and more experience getting large numbers of crime- and terrorism-related databases to talk with each other.

The big challenge for government, aside from the obvious one of a tight funding environment for these new efforts, will be fostering the kind of collaboration between techies and program people that gives each a better feel for what the other needs, so that big data efforts can get off the ground. Meanwhile, if smart government leaders -- whether on the tech or the program side -- are looking for ways to innovate and make a difference, big data, to use a phrase from the area itself, is ripe for mining.

Posted by Steve Kelman on Apr 09, 2015 at 5:12 AM

