What you don’t know can hurt you
- By John Verver
- Mar 18, 2014
The rise of big data comes as no surprise to auditors, who have long analyzed massive volumes of data to gain insight into the workings of organizations, writes John Verver.
If you’re in an agency leadership role, you are no doubt aware of the unprecedented challenges of controlling costs while simultaneously creating value for the taxpaying public.
Regardless of department, chances are that you are under immense pressure to demonstrate outstanding performance -- not only in terms of efficiency and effectiveness, but also in managing public money with competency and accountability. To navigate this era of reduced resources, you’re expected to focus relentlessly on operational efficiency, cost effectiveness, productivity, service and innovation. At the same time, scrutiny of public agencies is at an all-time high. For example, the Department of Defense is now statutorily required to be audit ready by 2017.
The spirit of audit readiness is fast becoming best practice for government organizations around the world -- to drive down costs, improve operating efficiencies, make better decisions, protect taxpayers’ money and run a better department. Today’s public-sector managers and leaders are expected to know their business processes, identify and close the loop on red flags, transparently document and manage workflow, and be able to collaborate with other stakeholders in order to be fully accountable to the taxpaying public and the lawmakers who represent them.
Big data and “truth is in the transactions”
While this list of challenges may be well known, what is less widely recognized is the role that modern audit techniques, based on data analysis, can play in addressing operational performance issues. The rise of Big Data -- as an approach to understanding customer trends and business opportunities -- is getting serious attention, particularly in the private sector. Less appreciated still is that auditors have for years been analyzing massive volumes of data in order to gain insight into the effectiveness of controls and the integrity of entire populations of transactions.
Organizations today collect a broader variety of data at a greater velocity than ever before. The operation of any agency relies on effective processing of often millions or billions of transactions. These include not only traditional accounting transactions -- such as procurement, payments, payroll, travel & entertainment, and purchase cards -- but also data and transactions deep within business processes and controls where information is exchanged between critical systems and applications.
The opportunity to use data to evaluate business processes, identify risks and fix processes is not new to auditors or others in government oversight functions. The opportunity now is to expand this use of technology more widely throughout government operations. Putting a magnifying glass over transaction data (and the controls that manage that data) through the use of analytic monitoring allows managers and process owners to identify fraud, errors, abuse and inefficiency -- before auditors and compliance specialists are involved.
A data-driven approach to audit, risk management and compliance provides answers from three perspectives: historical, current and future. Risk and control data analysis supports dynamic, risk-focused planning and decision making -- all based on the premise that the “truth is in the transactions.” In other words, by examining every transaction and testing from a variety of perspectives, it is quite realistic to develop a quantified and verifiable understanding of the integrity and compliance of an entire business process.
Transaction monitoring analytics can also combine detailed analysis of every transaction with overall statistical and trend analysis to help agencies answer questions such as:
- What happened and why?
- Where is the problem, and what actions do I need to take to solve it?
- What will happen if these trends continue?
- What’s the best/worst/most likely to happen next?
And perhaps most importantly:
- What actions should be taken to get optimal results?
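To make the idea concrete, here is a minimal Python sketch of combining a transaction-level test with a trend-level test. The payment records, the $5,000 approval limit and the flagging thresholds are all hypothetical, chosen only to illustrate the two layers of analysis; a real monitoring program would tune these rules to the agency’s own processes:

```python
from statistics import median

# Hypothetical payment transactions: (id, vendor, month, amount in dollars).
payments = [
    ("T-1001", "Acme Corp",    "2014-01",  2_400.00),
    ("T-1002", "Delta Supply", "2014-01",  4_995.00),
    ("T-1003", "Delta Supply", "2014-02",  4_990.00),
    ("T-1004", "Acme Corp",    "2014-02", 61_000.00),
    ("T-1005", "Acme Corp",    "2013-12",  2_100.00),
]

# Transaction-level test: every payment is checked against a rule. Here,
# amounts within 2% of an assumed $5,000 approval limit may signal
# deliberate splitting to stay under a second-signature requirement.
APPROVAL_LIMIT = 5_000.00
near_limit = [tid for tid, _, _, amt in payments
              if APPROVAL_LIMIT * 0.98 <= amt < APPROVAL_LIMIT]

# Trend-level test: monthly totals are compared against the historical
# median; a month running 50% or more above it is flagged for review.
totals = {}
for _, _, month, amt in payments:
    totals[month] = totals.get(month, 0.0) + amt
baseline = median(totals.values())
outlier_months = [m for m, t in totals.items() if t > 1.5 * baseline]

print(near_limit)       # T-1002 and T-1003 fall just under the limit
print(outlier_months)   # 2014-02 stands far above the other months
```

The point is not the specific rules but the pattern: every transaction is tested individually, and the same data feeds an aggregate view that surfaces trends no single transaction would reveal.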
Spinning transaction data into gold and the limitations of controls
It is not uncommon to hear an argument that a well-designed and well-implemented computer application should have all the right controls, checks and balances built into the system so that bad things simply cannot happen. This may seem like a reasonable theory, but of course in practice, no control system is infallible -- at least, not if it is also going to be cost effective and involve human interactions.
The challenge for many public sector functional areas is to find the right balance of control and efficiency. Too much control and nothing gets done efficiently. Too little and the risks of fraud, error and abuse run rampant. Fortunately, by monitoring transactions as they occur -- or shortly afterward -- it is realistic to achieve a factual understanding of the extent to which problems have occurred, together with the details of red-flag transactions that must be addressed.
We live in an age when technology and computer systems are increasingly pervasive, and yet fewer and fewer people really understand how they work. By spinning data analysis into golden insight, and then applying that information to the assessment of controls, federal leaders can move beyond treating the computer as a black box and start to do the following:
- Self-scrutinize and assure audit readiness.
- Improve efficiency – doing more with less.
- Identify and respond to risk more effectively.
- Improve transparency and public stakeholder interaction.
- Adapt as employees create workarounds and adjust their processes once they know they are being monitored.
So, where are the typical risks? Inherent risks in core business operations that, if not properly managed and monitored, can endanger agencies and drain taxpayer funds include contract non-compliance, payroll errors and abuse, misuse of purchase cards, duplicate vendor payments, tax fraud and benefits fraud. All of these can contribute to significant losses and decrease the public’s confidence in government efficiency and accountability.
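One of these risks -- duplicate vendor payments -- lends itself to a particularly simple analytic test: group payments on matching attributes and flag any group that appears more than once. The invoice records below are hypothetical, and a production test would also catch near-duplicates (transposed invoice digits, slightly different dates), which this sketch deliberately omits:

```python
from collections import defaultdict

# Hypothetical accounts-payable records: (vendor, invoice number, amount).
invoices = [
    ("Acme Corp",    "INV-881", 12_500.00),
    ("Delta Supply", "INV-042",  3_200.00),
    ("Acme Corp",    "INV-881", 12_500.00),   # same invoice paid twice
    ("Delta Supply", "INV-077",  3_200.00),   # same amount, different invoice
]

# Count payments by (vendor, invoice, amount); any combination seen more
# than once is a candidate duplicate for a reviewer to confirm.
seen = defaultdict(int)
for record in invoices:
    seen[record] += 1
duplicates = [rec for rec, count in seen.items() if count > 1]

print(duplicates)
```

Note that the second Delta Supply invoice is not flagged: the amounts match but the invoice numbers differ, which is exactly the kind of judgment call that analytics should route to a human reviewer rather than decide automatically.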
To get started identifying and mitigating these risks, your audit team can be the key to the design and implementation of a purpose-built risk and control analytics strategy, one that supports complex, value-added testing, issue management and long-term sustainability.
Aligning the implementation of analytics tools with the planning process and overall strategic objectives will help ensure the most bang for your buck, so to speak. Success comes from applying technology across broad financial, operational, and business systems. By combining specialized data analysis software with management’s responsibility to monitor risk and controls, forward-thinking agencies can transform data, as well as risk, into financial and operational improvement and opportunity.
John Verver is vice president of product strategy and alliances at ACL, an audit and risk management company. ACL provides solutions for various departments within a wide range of organizations including government entities, public sector agencies, banking and financial institutions and health care organizations. For more information, visit www.acl.com.