Add 'value' to the Vs of big data

The Memphis Police Department saw violent crime drop dramatically over a span of five years following the launch of Blue CRUSH, a predictive analytics system using data from cameras and police records to predict future crime.

Big data has traditionally been described by a series of words that begin with the letter ‘V’: volume, velocity, variety, and veracity.

The newest addition to that list is “value,” a term gaining critical importance in times of tough budget constraints and pressure on agencies to better analyze the data they have, said Dante Ricci, director of federal innovation at SAP.

“What agencies should be thinking of is, ‘What value could come out of it,’ not just creating a big data solution for the sake of big data,” Ricci said. “Big data is about the value you get out of it. It’s not tough to gauge the outcome (of big data investments) if you understand what the use case is going to be.”

Clay Richardson, senior analyst for Forrester Research, said agencies he works with are “trying to connect big data to get more value” by connecting data to operations.

The five Vs

Volume: The sheer amount of data generated that must be stored and analyzed.

Velocity: How quickly data is being produced, received, processed, changed and understood.

Variety: Structured and unstructured data generated by a wide range of sources.

Veracity: The quality and lineage of the data.

Value: The end result or bottom line of a big data initiative.

In essence, Richardson said, agencies are seeing success by making big data analytical processes “frictionless,” allowing data to be collected, analyzed and delivered to clients or customers with ease.

“A big piece of value is agencies trying to connect the big data to their core processes and operational processes so they can make changes to processes as data changes,” Richardson said. “So as more data and information comes in from the government, agencies are able to make quicker changes, and we’re definitely seeing that in some agencies.”

Data that can be analyzed and dispersed quickly helps an agency’s bottom line, Richardson said, and it keeps customers who use the data happy.

“A lot of what agencies want to understand is how data flows, who it goes to, who consumes it, why they need it, and agencies want to figure out how to infuse more intelligence and smarts into data and make it more consumable across the organization,” Richardson added. “Right now, a lot of the data is dumb – it isn’t smart in the sense that you’re not going to do anything with it other than build reports or a bit of analysis. Customers want more value out of the data.”

Some agencies already boast successful big data initiatives despite the relative newness of the technology.

The Internal Revenue Service used big data analytics to fix tax-filing errors in 2012, saving approximately $100 million in erroneous claims. The Department of Defense’s global shared service center, the Defense Finance and Accounting Service (DFAS), has saved approximately $4 billion in improper vendor payments since 2008 using a business activity monitoring software tool that continuously monitors several terabytes of Department of Defense transactions.
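The kind of rule-based check a business activity monitoring tool applies can be sketched very simply: scan a transaction feed for payments that repeat the same vendor, invoice and amount, a common signature of an improper duplicate payment. The sketch below is only an illustration of the technique; the field names and the duplicate rule are hypothetical, not DFAS's actual logic.

```python
# Minimal sketch of rule-based payment monitoring (hypothetical fields,
# not the actual DFAS tool): flag transactions that repeat the same
# vendor/invoice/amount combination, a common sign of duplicate payment.
from collections import defaultdict

def flag_duplicate_payments(transactions):
    """Return all transactions whose (vendor, invoice, amount) key repeats."""
    seen = defaultdict(list)
    for tx in transactions:
        key = (tx["vendor"], tx["invoice"], tx["amount"])
        seen[key].append(tx)
    return [tx for group in seen.values() if len(group) > 1 for tx in group]

# Made-up example feed: payments 1 and 2 duplicate each other.
payments = [
    {"id": 1, "vendor": "Acme", "invoice": "INV-100", "amount": 5000.0},
    {"id": 2, "vendor": "Acme", "invoice": "INV-100", "amount": 5000.0},
    {"id": 3, "vendor": "Globex", "invoice": "INV-200", "amount": 1200.0},
]
flagged = flag_duplicate_payments(payments)
```

A production monitoring tool would run checks like this continuously against incoming terabytes of transactions rather than a static list, but the core pattern is the same: group records by a key and flag the groups that violate a rule.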

Dollars and cents are not the only currency for measuring value. Big data initiatives have been undertaken successfully at local and state levels, Ricci said, in some cases saving lives through programs at hospitals or by law enforcement agencies.

Violent crime in Memphis, Tenn., dropped 31 percent from 2006 to 2011 following the launch of Memphis Police Department’s Blue CRUSH (Crime Reduction Utilizing Statistical History) pilot program in partnership with the University of Memphis. The predictive analytics tool combined data from surveillance cameras, crime records and numerous other sources and used it to predict crime hotspots and to provide officers with real-time information about suspects and victims.
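The hotspot side of such a system can be illustrated in miniature: bin historical incident locations into grid cells and rank the cells with the most past crime as candidate patrol areas. This is a toy sketch of the general idea, not Blue CRUSH's actual model, and the coordinates below are invented for the example.

```python
# Toy hotspot ranking (not Blue CRUSH's actual model): bin historical
# incident coordinates into grid cells and rank cells by incident count.
import math
from collections import Counter

def rank_hotspots(incidents, cell_size=0.01, top_n=3):
    """Map each (lat, lon) to a grid cell and return the busiest cells."""
    counts = Counter(
        (math.floor(lat / cell_size), math.floor(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Made-up incident coordinates clustered around two areas.
incidents = [
    (35.149, -90.049), (35.149, -90.048), (35.150, -90.049),  # cluster A
    (35.120, -90.010), (35.121, -90.011),                     # cluster B
    (35.200, -90.100),                                        # isolated
]
hotspots = rank_hotspots(incidents)
```

A real deployment would weight recent incidents more heavily and fold in the other data sources the article mentions, but counting past events per area is the baseline that predictive models improve on.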

A study by Nucleus Research showed the city of Memphis spent about $400,000 per year on the initiative, including personnel costs, and calculated about $7 million in annual benefits, a return on investment of 860 percent.

“If you’re able to save lives because of better data and situational awareness, that’s invaluable,” Ricci said.

About the Author

Frank Konkel is a former staff writer for FCW.
