Big Data

In-memory computing gaining momentum?


As the price of RAM has plummeted, in-memory computing is becoming an increasingly affordable addition to existing IT infrastructure.

With the world's total data volume of 2.8 zettabytes (that's 2.8 trillion gigabytes) expected to grow 20-fold by 2020, big-data technologies are certain to remain on the radar of the U.S. government, by far the planet's largest information producer and collector.

And while technologies designed to process and analyze large amounts of data are still in their relative infancy, major advances are already allowing analytics on massive amounts of data at real-time speed.

One such advancement, in-memory computing, is gaining momentum in the federal space, according to Michael Ho, vice president of Software AG Government Solutions.

In-memory computing processes data held in RAM rather than pulling it from disk-based stores such as relational databases, drastically reducing compute time. Speed can be critical in big data jobs for federal agencies, but pulling data from various mainframes and servers – where governance issues and lag time are frequent obstacles – is certainly not conducive to quick results.
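The speed difference is easy to demonstrate on a small scale. The sketch below (a hypothetical benchmark; the table name, row count and lookup pattern are illustrative assumptions, not details from the article) compares repeated point lookups against a disk-backed SQLite table with the same lookups against the same data held in a Python dictionary in RAM:

```python
import os
import sqlite3
import tempfile
import time

ROWS = 50_000     # illustrative dataset size
LOOKUPS = 5_000   # number of point lookups to time

# Build a small on-disk table.
db_path = os.path.join(tempfile.gettempdir(), "imc_demo.db")
conn = sqlite3.connect(db_path)
conn.execute("DROP TABLE IF EXISTS readings")
conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 ((i, i * 0.5) for i in range(ROWS)))
conn.commit()

# The same data held entirely in memory, as an in-memory system would keep it.
in_memory = {i: i * 0.5 for i in range(ROWS)}

# Pseudo-random point lookups, identical for both paths.
ids = [(i * 7919) % ROWS for i in range(LOOKUPS)]

start = time.perf_counter()
disk_total = sum(
    conn.execute("SELECT value FROM readings WHERE id = ?", (i,)).fetchone()[0]
    for i in ids
)
disk_time = time.perf_counter() - start

start = time.perf_counter()
mem_total = sum(in_memory[i] for i in ids)
mem_time = time.perf_counter() - start

conn.close()
os.remove(db_path)

print(f"disk lookups: {disk_time:.4f}s  in-memory lookups: {mem_time:.4f}s")
```

Production in-memory platforms do far more than this (distributed caching, persistence, analytics), but the underlying win is the same: RAM access avoids both disk I/O and per-query overhead.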

In-memory computing offers a solution to those problems, Ho told an FCW executive briefing in Washington on Oct. 24.

"In the same fashion with big data, we can leverage in-memory technologies now and take advantage of all that RAM," said Ho, adding that in-memory computing offers a "real-time data revolution."

Ho said in-memory computing offers promise in situational awareness, cybersecurity defense, financial analysis and data transparency. The biggest public-sector consumers of in-memory technologies are the defense and intelligence agencies, so few cases in government have received much public attention. What is known, though, is that latency – even on large compute jobs – is typically reduced to milliseconds.

"You load all that data into the memory, and we can do things we weren't able to do before or things we'd have to wait minutes, hours or even days for," Ho said.

Ho said some of the hype attributed to in-memory computing is a result of economics. A gigabyte of RAM today costs a few dollars, compared with several thousand dollars in the 1960s. By 2025, Ho said, the cost for the same amount of computer memory will be down "to pennies." Many of the leading companies in the market offer in-memory systems that run on top of existing IT infrastructure, which bodes well for cash-conscious agencies.

And given how quickly the world's volume of data is expected to increase, growth in the in-memory computing market makes even more sense, according to Joe Shaffner, director of Database and Technology Solutions Engineering at SAP.

"It's a big game-changer," Shaffner said.

Even with in-memory capabilities, however, achieving real-time results may not be possible yet for some government agencies -- including those within the defense community.

Lt. Col. Bobby Saxon, division chief and project director for the Army Enterprise Management Decision Support program office, said "data dumps" required for some missions still pull large amounts of data from legacy "mainframe" systems, leading to what he called "near real-time data." Sometimes though, Saxon said, you're "looking at yesterday's data."

About the Author

Frank Konkel is a former staff writer for FCW.

