If we are looking to use data as part of an audit, we need to know what broad approach we will use before we start.
How, specifically, we will scope and plan the use of data; how we will determine what to profile and what to test.
Each of the approaches detailed here will satisfy a different need.
We need a plan. We need to decide on an approach.
If we don't, we may not get the outcome that we're looking for. We may not satisfy the audit objective.
The four approaches detailed in this article have different levels of complexity.
Not an exact science, but it looks something like this:
What we need and what we can deliver will guide the approach to take. The factors include:
Each of these approaches will be valid, but we need to pick the right one for our circumstances – using the factors listed earlier.
Once we choose an approach, it doesn’t have to be set in stone. We may change our mind later as new facts emerge or if we decide to go deeper.
When we get to the reporting stage, we know that there is some data that we can use to enhance our report and provide context for the audit overall. This could mean using data purely for graphs or charts in the report.
As an example, we may have an audit that focuses on supply chain risks and modern slavery. If we have foreign (or multinational) suppliers, we might want to provide some information about where known victims of trafficking have been exploited, for context.
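If all we need is a chart for context, the data preparation can be as light as aggregating case counts by country. A minimal sketch, where the country names and case records are placeholders rather than real figures:

```python
from collections import Counter

# Hypothetical records: one entry per known case, tagged with the
# country where the exploitation occurred (placeholder values only).
cases = ["Country A", "Country B", "Country A", "Country C",
         "Country B", "Country A"]

# Aggregate into counts, highest first -- exactly the shape a bar
# chart in the report would need.
counts = Counter(cases).most_common()
for country, n in counts:
    print(f"{country}: {n}")
```

In practice the counts would feed a charting tool rather than be printed; the point is that the analysis itself is trivial, because the data is only there to give the report context.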
Often, you want to know about this data up front. The more experienced audit teams will usually begin with the end in mind. They know what it is that they want to see in the reporting.
But sometimes you go through the audit and you don't know what you're going to find. As you get into reporting, you realize that there's some data that you need in order to provide context.
Regardless of when you decide to adopt this approach, it can provide value. There are better ways to use data, but sometimes this is all that you need.
If this is your sole approach (i.e., you take this approach for all of your audits), you may be missing an opportunity.
Consider one of the other approaches, perhaps initially for a select set of audits.
Most processes are designed in a fairly standard way. Often we determine what the tests are going to be up front, based on what we think would be the risky areas. Sometimes it's not based on risk, but on a library of routines.
A note on pre-existing libraries of routines
Many audits are artificially constrained because we try to use existing test libraries.
The audit analytics software vendors tout these libraries as benefits.
But are they really that beneficial? Or are they limiting our ability to think properly about the risks and what we need to do?
PRO: these can help when you don’t know where to start or if you are focused squarely on fraud/compliance.
CON: if you rely on them, you will likely miss significant opportunity to provide better value – to properly deliver on your audit objective.
The bottom line: they can be helpful, sometimes. But be careful with them. (We find such lists to be limiting, so we generally avoid them.)
Traditionally, when we focus on a process, we understand the various individuals involved in the process, the procedures that have been documented and the systems that are involved.
We follow that process from start to end, as it has been designed, to identify risks and controls. We then test the process using the data that flows through it.
If your audit team comes from a largely financial statement audit background (a.k.a. external audit), then you would typically follow a process-focused approach. That's what external auditors do. Now, financial statement audits are pretty stock standard. The thing to think about here is: is that the most relevant approach for the audit that you are conducting? Does it make sense in terms of what your audience needs?
Now before you select this approach, ask yourself: what is the objective of the work?
Are we focused on ensuring that the process is working or that the process is effective/efficient?
Or are we focused on ensuring that a particular outcome is being achieved?
If the former, focus on the process. In most cases though, arguably, we want to assert on the outcome.
Are we achieving the right outcomes, for example with:
Regardless of what the process is, are we meeting those expectations?
The hypothesis then is that the expectation is being met. We can use data to help prove/disprove that.
For example, we might recalculate revenue to confirm that we are both charging appropriately and treating customers fairly.
We start by understanding the broad expectation, gather the data that we need, model/calculate the expected result and compare that to the actual result. We then explore any differences, fine-tuning our model iteratively. It is only at this point that we will look at processes or procedures, to work out exactly why there were differences. This helps us avoid unintended bias and opens us up to the full range of improvement opportunities.
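For the revenue example, the core loop can be sketched in a few lines. Everything below is hypothetical: the plan names, contract rates, invoice records, and tolerance are illustrative, not a real rating model.

```python
# Hypothetical contract rates (price per unit) and a tolerance for
# ignoring rounding-level differences.
CONTRACT_RATES = {"standard": 10.0, "premium": 15.0}
TOLERANCE = 0.01

# Hypothetical billing extract: what was actually charged.
invoices = [
    {"customer": "C1", "plan": "standard", "units": 12, "billed": 120.00},
    {"customer": "C2", "plan": "premium",  "units": 10, "billed": 165.00},
    {"customer": "C3", "plan": "standard", "units": 5,  "billed": 50.00},
]

# Model the expected charge, compare to actual, and keep only the
# differences worth investigating.
exceptions = []
for inv in invoices:
    expected = CONTRACT_RATES[inv["plan"]] * inv["units"]
    diff = inv["billed"] - expected
    if abs(diff) > TOLERANCE:
        exceptions.append((inv["customer"], expected, inv["billed"], diff))

for customer, expected, billed, diff in exceptions:
    print(f"{customer}: expected {expected:.2f}, billed {billed:.2f}, diff {diff:+.2f}")
```

Only the exceptions go forward to the process-level investigation, which keeps the follow-up focused and makes any error easy to quantify.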
As an auditor, with this type of approach, you can get a lot of personal satisfaction from your work, knowing that you're contributing to something bigger, often directly related to your organization’s purpose, not just critiquing a process.
You are also providing a very specific, granular set of improvement opportunities, making it easier to quantify any errors. Importantly, you are making it easy to fix the errors and help prevent them from recurring.
Sometimes this is called “data-driven” or even “data-led”. But what exactly do those phrases mean? I have yet to come across any recognized, established, consistent definitions for these phrases. Sometimes they are used interchangeably, even in the same breath (or sentence). Other times, “data-driven” is described as the opposite of “data-led” and the two are compared!
The terms will mean different things to different organizations. Some might say that they are just buzzwords created by industry bodies, consultancies, and the like. I tend to agree.
Regardless of these differences, “Exploratory” means that you let the data do the talking. There are no pre-defined ideas. What you see in the data will determine what you do and potentially even how you do it.
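As a toy illustration of that idea, a profiling pass over a numeric field can surface unusual values without any pre-defined test. The payment amounts below are hypothetical, and the 5×MAD threshold is just one common, robust rule of thumb, not a standard:

```python
import statistics

# Profile a numeric field and let unusual values suggest where to dig.
# The payment amounts are hypothetical.
amounts = [102.0, 98.5, 99.8, 100.4, 101.1, 101.2, 97.9, 512.0]

median = statistics.median(amounts)
# Median absolute deviation: a robust spread measure that a single
# extreme value cannot inflate (unlike the standard deviation).
mad = statistics.median(abs(a - median) for a in amounts)

# Flag anything more than 5 MADs from the median for follow-up.
outliers = [a for a in amounts if abs(a - median) > 5 * mad]
print(outliers)  # prints [512.0]
```

Whatever turns up then shapes the next question, rather than the questions being fixed in advance.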
This may sound like a bottom-up approach, and it largely is. But don’t discount it yet. It has merit. And it is on the far right of the spectrum because it is complex and can easily go wrong. Now don’t confuse “advanced” with “most mature”. It may not be the best fit.
Care needs to be taken with this approach. It is likely to fail if the organization that you're working with doesn't have a strong data-focused approach to executing on its overall mandate. If you have poor data quality or inadequate data governance practices, this is going to be difficult to achieve. Unless you are Spotify or Netflix or Uber or Google or Facebook, or one of those organizations that have really adopted data and data-driven approaches, you're going to struggle to get a data-driven audit. You're potentially overreaching.
And even if you look at some of these organizations that we mentioned, at least for a few of them, the way in which they approach their “data-driven” management would largely be in their core business areas. For example, for streaming media services, that would be how they recommend new material to their audiences, how they select new material, how they identify new shows to put through, how they determine what people are watching, etc.
As an auditor, when you are looking at those areas within those organizations, it makes sense to take a data-driven approach.
If you are looking at ancillary back-office processes, you may want to go back to one of the previous approaches.
The first – data for reporting only – may be relevant for some audits. But not all.
It is becoming more and more prevalent. For good reason. We have had occasion to use this approach.
The fourth – exploratory – will only make sense for certain organizations and certain subject matter.
If you have poor data governance or poor data quality, maybe one of the other approaches will suit better.
That leaves us with process and hypothesis.
Which of the two can help overcome these common challenges?
A reasonable way to tackle all of these: start with the expected outcome.
Now read the three challenges again, this time thinking about replacing the process focus with an outcomes focus.
Do you see the differences?