Big Data

Government can borrow from private sector’s big-data playbook


What do Web giants like Facebook and Google have in common with advertising technology companies?

Both are leaders in big data, with easily scalable IT infrastructures that collect real-time systems data from disparate sources and with the in-memory computing power to run big jobs fast.

This leads to faster actionable insights with all sorts of applications in civilian agencies, according to Carl Wright, executive vice president at MemSQL, a company that specializes in in-memory SQL databases.

These kinds of next-generation infrastructures, predicated on commoditized, scalable architectures, fill buildings with distributed frameworks of servers that significantly cut latency in big batch projects. Such systems process in minutes what legacy enterprise systems – those common at many government agencies – could take hours or days to do, Wright said. These systems are the reason a user’s Internet search term can so quickly lead to user-specific ads.

“You have the ability to use every core and server to facilitate a load, so it creates a true high-performance computing environment,” Wright said.
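The article doesn’t show MemSQL’s internals, but the “every core” idea can be sketched in miniature on a single machine. Below is a minimal illustration using Python’s standard multiprocessing module; the record counts and the score_chunk workload are hypothetical stand-ins for real analytics work, not anything Wright described.

```python
# A minimal sketch (not MemSQL's actual API) of splitting one batch job
# across every available core so it finishes in a fraction of the
# single-threaded time. The workload is a hypothetical placeholder.
from multiprocessing import Pool, cpu_count

def score_chunk(records):
    # Stand-in for real per-record analytics work.
    return sum(r * r for r in records)

def run_batch(records, chunks=None):
    chunks = chunks or cpu_count()          # one chunk per core by default
    size = max(1, len(records) // chunks)
    pieces = [records[i:i + size] for i in range(0, len(records), size)]
    with Pool(chunks) as pool:              # fan the pieces out to all cores
        return sum(pool.map(score_chunk, pieces))

if __name__ == "__main__":
    print(run_batch(list(range(1_000_000))))
```

Scale that fan-out from cores on one box to servers across a building and you have the distributed frameworks Wright is describing.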

Real-time systems data already plays a major role in how defense and intelligence agencies carry out their missions and daily operations. As some of the National Security Agency’s surveillance capabilities and data ingest techniques have come to light in recent months, it is clear that the intelligence community is far ahead in the big-data game.

Yet Wright said real-time systems have many potential benefits on the civilian side of government. The ability to compute more data at faster intervals gives an agency a better handle on its decision-making, letting it draw on significantly more data over longer periods of time to reach conclusions. In arenas such as fraud detection and disease control, Wright said, various collections of data – whether unstructured, structured or semi-structured – can be processed for real-time reporting.
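Wright’s fraud-detection example boils down to running aggregate queries over data held in memory as it streams in. The sketch below uses Python’s built-in sqlite3 module as a single-machine stand-in for a distributed in-memory SQL store; the transactions table, the flag threshold and the sample rows are all hypothetical.

```python
# A minimal sketch of in-memory, real-time reporting of the kind Wright
# describes for fraud detection. sqlite3 stands in for a distributed
# in-memory SQL database; the schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")          # data lives in RAM, not on disk
conn.execute("CREATE TABLE txns (account TEXT, amount REAL)")

# Ingest structured records as they arrive.
conn.executemany(
    "INSERT INTO txns VALUES (?, ?)",
    [("A-100", 40.0), ("A-100", 9500.0), ("B-200", 12.5)],
)

# Report accounts whose total activity crosses the flag threshold.
for account, total in conn.execute(
    "SELECT account, SUM(amount) FROM txns "
    "GROUP BY account HAVING SUM(amount) > ?",
    (5000.0,),
):
    print(f"flag {account}: total {total:.2f}")
```

Because the query runs against memory rather than disk, reports like this can be refreshed continuously as new records land, which is the core of the real-time pitch.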

Depending on the vendor, in-memory computing used in distributed frameworks can vary significantly in price, Wright said, but the fact that most government agencies are facing budget shortfalls might actually be a good thing for innovation. And some real-time systems can at least run on the front end of existing legacy systems. The real revolution, Wright said, is on the back end.

“The good news for change is that it means agencies can’t keep doing business as usual. They have to look for more economic systems and can’t keep going back to the same vendors locked in bad cost structures,” Wright said.

About the Author

Frank Konkel is a former staff writer for FCW.

