Success story: DFAS spots bad payments with big data
- By Frank Konkel
- Dec 20, 2012
Big data has been a big topic of conversation for some time now, but many in the federal space still struggle to define it, let alone grasp its potential. Some agencies, however, are already using big data and analytics to good effect.
Case in point: The Department of Defense’s global shared service center, the Defense Finance and Accounting Service (DFAS), employs a business activity monitoring software tool provided by Oversight Systems that continuously scans DFAS’ vendor payments in an effort to identify and prevent improper transactions. That project has been cited by the administration’s PaymentAccuracy.gov as one of the government’s success stories in reducing improper payments.
The tool has saved DFAS more than $4 billion in improper payments to vendors over the past four years, according to Aubrey Vaughan, managing director of the public sector at Oversight Systems, which holds a recently renewed contract with DOD. DFAS officials also spoke with FCW about the project, but were not cleared to comment on the record in time for publication.
The DFAS system – powered by a collaborative reasoning engine – automates the oversight of $359 billion allocated annually to vendors by DFAS, which makes vendor payments for DOD and various military branches. The tool sifts through terabytes of data and runs analytics against it, flagging fishy vendor payments and directing them to remediation, Vaughan said.
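The article does not detail the rules the system applies, but continuous monitoring tools of this kind typically screen each payment against known improper-payment patterns, such as duplicate invoices. The sketch below illustrates one such hypothetical check; the field names and the duplicate-payment rule are assumptions for illustration, not Oversight Systems' actual logic.

```python
# Illustrative sketch only: flag potential duplicate vendor payments.
# The record fields and the matching rule are hypothetical, not
# Oversight Systems' actual detection logic.

def flag_duplicates(payments):
    """Return payments that share vendor, invoice number, and amount
    with an earlier payment -- a classic improper-payment signal."""
    seen = set()
    flagged = []
    for p in payments:
        key = (p["vendor"], p["invoice"], p["amount"])
        if key in seen:
            flagged.append(p)  # would be routed to a remediation queue
        else:
            seen.add(key)
    return flagged

payments = [
    {"vendor": "Acme Corp", "invoice": "INV-100", "amount": 25000.00},
    {"vendor": "Acme Corp", "invoice": "INV-100", "amount": 25000.00},  # repeat
    {"vendor": "Globex",    "invoice": "INV-200", "amount": 9800.00},
]
print(flag_duplicates(payments))  # flags the second Acme Corp payment
```

In a production system of the scale described, rules like this would run continuously over incoming payment streams rather than a static list, but the flag-and-remediate pattern is the same.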
“The amount of people it would take to look at this amount of data processes is just not feasible,” Vaughan said. “It’s a big data solution that has saved taxpayers $4 billion in potential errors. That’s a lot of money in times of tough budgets.”
“The irony of all this is people are really talking about big data now, but we’ve been doing it – people have been doing it – for a while now,” Vaughan said.
Agencies, he said, have just begun to figure out how to maximize value from big data solutions, but he and others argue that old-hat legislative policies might limit the speed of progress.
SAP Public Services Vice President and Chief Innovation Officer David Robinson, speaking at a December executive briefing sponsored by FCW and TechAmerica, said officials should push to establish new policies that direct how big-data technologies will be used in the near future. It won’t happen overnight, he said, but just because technology innovation occurs faster than the government can draft policy is no reason not to get started.
Vaughan said a directive from the Office of Management and Budget might help spur faster policy changes.
“There are some low-hanging fruit that agencies could use continuous modeling software for to see an immediate impact in savings,” Vaughan said. “You don’t see that legislation right now that says, ‘this is what you should be using big data for.’”
Frank Konkel is a staff writer covering big data, mobile, open government and a range of science/technology issues. Connect with him on Twitter at @Frank_Konkel.