In-memory computing gaining momentum?

Technology advances and declining prices are sparking a "real-time data revolution," industry experts say.


As the price of RAM has plummeted, in-memory computing is becoming an increasingly affordable addition to existing IT infrastructure.

With the world's total data volume of 2.8 zettabytes (that's 2.8 trillion gigabytes) expected to grow 20-fold by 2020, big-data technologies are certain to remain on the radar of the U.S. government, by far the planet's largest information producer and collector.

And while technologies designed to process and analyze large amounts of data are still in their relative infancy, major advances are already allowing analytics on massive amounts of data at real-time speed.

One such advancement, in-memory computing, is gaining momentum in the federal space, according to Michael Ho, vice president of Software AG Government Solutions.

In-memory computing processes data held in a server's RAM rather than data pulled from disk-based relational databases, drastically reducing compute time. Speed can be critical in big-data jobs for federal agencies, but pulling data from various mainframes and servers – where governance issues and lag time are frequent obstacles – is not conducive to quick results.
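
As a rough illustration of the contrast Ho describes, the sketch below times the same key lookups against a disk-backed SQLite table and against the same records held in RAM as a Python dictionary. The table name, record count and workload are hypothetical stand-ins, not a description of any agency system or vendor platform, and real in-memory platforms work at far larger scales.

```python
# Minimal sketch: a disk-backed SQLite table stands in for a relational
# source; a plain dict stands in for data already loaded into RAM.
import sqlite3
import time

# Build a small disk-backed relational table (hypothetical data).
conn = sqlite3.connect("events.db")
conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?)",
                 [(i, f"record-{i}") for i in range(100_000)])
conn.commit()

# Load the same records once into memory.
in_memory = {row[0]: row[1] for row in conn.execute("SELECT id, payload FROM events")}

def disk_lookup(key):
    # Each call goes back to the database on disk.
    return conn.execute("SELECT payload FROM events WHERE id = ?", (key,)).fetchone()[0]

def memory_lookup(key):
    # Each call is a dictionary lookup in RAM.
    return in_memory[key]

for name, fn in [("disk", disk_lookup), ("memory", memory_lookup)]:
    start = time.perf_counter()
    for key in range(0, 100_000, 7):
        fn(key)
    print(f"{name}: {time.perf_counter() - start:.4f} s")
```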

In-memory computing offers a solution to those problems, Ho told an FCW executive briefing in Washington on Oct. 24.

"In the same fashion with big data, we can leverage in-memory technologies now and take advantage of all that RAM," said Ho, adding that in-memory computing offers is a "real-time data revolution."

Ho said in-memory computing offers promise in situational awareness, cybersecurity defense, financial analysis and data transparency. The biggest public-sector consumers of in-memory technologies are the defense and intelligence agencies, so few government use cases have received much public attention. What is known, though, is that latency – even on large compute jobs – is typically reduced to milliseconds.

"You load all that data into the memory, and we can do things we weren't able to do before or things we'd have to wait minutes, hours or even days for," Ho said.

Ho said some of the hype attributed to in-memory computing is a result of economics. A gigabyte of RAM today costs a few dollars, compared with several thousand dollars in the 1960s. By 2025, Ho said, the cost for the same amount of computer memory will be down "to pennies." Many of the leading companies in the market offer in-memory systems that run on top of existing IT infrastructure, which bodes well for cash-conscious agencies.
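
A back-of-envelope calculation shows why those economics matter. The figures below are assumptions drawn loosely from the article's numbers (a few dollars per gigabyte today, "pennies" by 2025) and a hypothetical 1-terabyte working set, not vendor pricing.

```python
# Rough cost of holding a hypothetical 1 TB dataset entirely in RAM.
DATASET_GB = 1_000           # hypothetical 1 TB working set
PRICE_PER_GB_TODAY = 5.00    # assumed "a few dollars" per gigabyte
PRICE_PER_GB_2025 = 0.05     # assumed "pennies" per gigabyte, per Ho's projection

for label, price in [("today", PRICE_PER_GB_TODAY), ("2025 (projected)", PRICE_PER_GB_2025)]:
    print(f"Holding {DATASET_GB} GB in RAM {label}: ~${DATASET_GB * price:,.2f}")
```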

And given how quickly the world's volume of data is expected to increase, growth in the in-memory computing market makes even more sense, according to Joe Shaffner, director of Database and Technology Solutions Engineering at SAP.

"It's a big game-changer," Shaffner said.

Even with in-memory capabilities, however, achieving real-time results may not be possible yet for some government agencies – including those within the defense community.

Lt. Col. Bobby Saxon, division chief and project director for the Army Enterprise Management Decision Support program office, said "data dumps" required for some missions still pull large amounts of data from legacy "mainframe" systems, leading to what he called "near real-time data." Sometimes, though, Saxon said, you're "looking at yesterday's data."
