The Assurance Show

Episode 21 | Four ways to approach the use of data for internal audits and performance audits

September 21, 2020
Episode Summary


In this episode we discuss four approaches to using data for internal audits and performance audits.

  1. Reporting only
  2. Process focused
  3. Hypothesis-based
  4. Exploratory (a.k.a. data driven or data led)

Link: Related blog article

Episode Transcript

Yusuf Moolla: Today we're going to talk about different ways in which we can approach the use of data for internal audit and performance audit. So, where we're looking to use data as part of an audit, what is the broad approach we're going to use? We normally need to know that before we start: how specifically are we going to plan the way in which we scope the use of data and then use it; how are we going to profile; how are we going to report? The different approaches we'll talk about today each satisfy a different need, and we can obviously change as we go along, but it's important that we determine upfront what it is that we're going to be doing.

Conor McGarrity: We're going to talk about four common approaches in general, ranging from the more basic end of the spectrum towards the more advanced end of the spectrum.

Yusuf: We wrote an article on this, and it's such an important topic. Before you even think about touching the first line of code or the first set of data, what is that broad approach? It's important to auditors because if you start with the wrong approach, you may not get to the outcome that you're looking for. So selecting the right approach, and knowing what the pros and cons of each approach are, will be important at the beginning.

Conor:  So, before we even get into the four approaches in detail, what are some of the factors that we need to consider that will help guide our choice?

Yusuf:  Four key factors. The first is the audit objective: what is the audit trying to achieve? The second is the deliverables that are expected out of the audit. The third is the resources and capability we have: what capacity is there within the team to enable the use of data, and what capability levels do we have? The capability levels will determine how deep or light we need to go. And the fourth is how mature we are in using data for our audits; depending on our level of maturity as an audit team, we may need to go either towards the more basic end or closer to the more advanced end.

Conor:  Okay, so bearing all those factors in mind, then we need to determine which one is right for our circumstances. Let's talk about the first approach, and that's one where we'd be using data for reporting only. What exactly are we thinking about there?

Yusuf:  This is where we know we're not going to be using data for any other purpose, but for reporting. Now sometimes we don't select this approach upfront. We get through the audit and at the point at which we're getting to report on the work that we've done, we decide that actually it would make sense to have some data used within our report in order to provide some sort of overall context or overall profile.

Conor:  I'm just thinking here, is there ever going to be a circumstance where we don't at least use data for the reporting? Basically, do we ever do an internal audit or a performance audit where we wouldn't use data in some sense for our reporting?

Yusuf:  So, your question is: are we ever going to conduct an audit where we don't use data at all? There are lots of internal audits and performance audits that have been done over the years where no data was used at all. I think that's a missed opportunity, because if you're not at least using data to enhance your reporting, you may end up with a report that doesn't properly contextualize the environment in which the work is being done or reported on. In the performance audit world, you get all sorts of topics where we're talking about things like the effectiveness of something, or compliance with something. When you're doing that, you may have selected a sample of entities to report on, and those would be the entities that you do a deeper dive on. If you contextualize the broader environment, to explain what you have covered and what you haven't, then you've at least provided a picture of what the audit looked at in relation to the broad environment, and what that means in the context of the overall perspective. What I mean by that is: let's say you have 50 entities in the government sector that you operate in; quite often you would then look at only five or six entities as a deeper dive.

If you're able to explain where those five entities lie in terms of the broader environment, you're then able to demonstrate the basis on which you're drawing a conclusion. We had an audit that we were involved in not too long ago, where we were evaluating the impact of a community service being provided. There were 79 entities in that region, and we covered five. 5 out of 79 is roughly 6%.

But the actual population that we covered, by looking at those five entities, was one eighth - 12.5% - which is reasonably significant in the context of an audit. So even if all you did was provide that context, explaining that the sample we selected covers this portion of the population, and you're using data for reporting in that way, there's an easier story to tell: an easier extrapolation from what we found to broader audit findings.
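Yusuf's arithmetic here is easy to make concrete. A minimal sketch in Python, using the figures from this example; the population counts are hypothetical, chosen only to reproduce the 12.5% coverage mentioned:

```python
# Illustrative figures from the community-service audit example:
# 5 of 79 entities were examined in depth.
entities_total = 79
entities_sampled = 5

entity_coverage = entities_sampled / entities_total  # share of entities
print(f"Entity coverage: {entity_coverage:.1%}")     # roughly 6.3%

# The sampled entities served a larger share of the actual population,
# so coverage by population is the more meaningful context figure.
# (These population numbers are hypothetical, chosen to give 12.5%.)
population_total = 800_000
population_sampled = 100_000

population_coverage = population_sampled / population_total
print(f"Population coverage: {population_coverage:.1%}")  # 12.5%
```

The point of the sketch: the two coverage figures use the same sample but different denominators, and it is the population-based one that best contextualizes the findings.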

Conor:  Like we said, this is number one of the four approaches, so it's towards the more basic end of the scale, where we're using data purely for reporting only. So it would really be a rare or exceptional circumstance where you wouldn't at least be using data for reporting, either in your performance audit or your internal audit.

Yusuf:  There would be some situations, but I can't think of too many off the top of my head right now. In every situation that we've had, where we were involved in either an internal audit or a performance audit, we've been able to find some data - either open data or proprietary data - that was able to at least provide some context for our reporting.

And people love to see data and statistics to contextualize what they see. It can't be the only approach that you use, but if you're not doing anything at all, this is a good place to start.

Conor:  And of course, if you're using data purely for reporting - like you said, to contextualize issues or put some broader environmental data around the specifics of the audit you're doing - it's always useful to speak to the stakeholders and understand what they would like to see and how they read reports.

Yusuf:  Sometimes people know what it is they want to see if they've seen these sorts of things before. Sometimes they haven't seen this sort of thing before, and they don't know what is possible. So, there may be a combination of asking people what they want, but also thinking about what it is that you might be able to use.

Conor:  The main message there is that if you're only using data for reporting purposes, there's still a lot of value that can be gained from even that more fundamental approach.

Yusuf:  Let's go to the opposite end of the spectrum and talk about exploratory for a second. We call this exploratory for lack of a better accepted term. This is at the more advanced end; sometimes you'll hear it called data driven or data led, but those terms don't really have proper definitions.

You can look around and different people talk about it in different ways.  Exploratory means that you let the data do the talking. So, you don't actually go in with any predefined idea. You work out what it is that you are going to look at based on what you see in the data.

It is largely bottom up, because you're not coming up with any hypothesis or direct objective. There is benefit in it, but it really is at the extreme right of the spectrum, because it's complex and can go wrong easily. It is more advanced, but that doesn't mean it's the most mature; most mature is usually the best fit, and this is not necessarily the best fit. It can be very valuable. But the challenge with this approach is when you're working in an organization, or conducting an audit relating to an organization or entity, that doesn't have a very strong data-focused approach itself.

Then you might fall flat. So if you don't have good data governance practices in place, if you don't know where all of your data is, if you're not already using your data to drive business activity, you're not going to have the level of quality that you need to be able to take that exploratory approach.

You may not have enough data, and the data that you do have may not be of sufficient quality: in terms of accuracy, in terms of completeness, in terms of actual integrity of the data. Unless you're one of those entities that are significantly advanced in the way data is being used, or that started off as data-driven entities, you will struggle to get to an exploratory approach.

Conor:  So, who would you be thinking about there in terms of those advanced users of data?

Yusuf:  The typical players, like Google, Netflix, et cetera. Netflix is an interesting one. If you have a Netflix account, and many people do, after you've watched something they're able to recommend new shows to you based on what you've seen - based on your history.

And as that history builds up and builds up, they're able to use it to recommend more and more shows that largely will be in line with what you like.

They know exactly what your viewing history is, because it's all being recorded. They know what you're accessing, how many times you've watched it, and whether you've watched part of an episode or the whole episode. They then use that to provide you with suggestions for what to watch next.

But that's in relation to their core business: what they provide to viewers.

Their back office operations - things like accounting and payroll and the like - wouldn't necessarily be as data driven as their core product offering analytics. So the way they use data to provide data-driven recommendations within their core product offering, the governance they have around that data, the way they use it, the sophistication they have - it would look different at the front end than at the back end.

And because that's not their core job function - it's not the organizational mission to have a hundred percent accuracy in financials, for example - you'll often find that if you're doing an audit on the front end, the marketing and product area, then you can take a data-driven approach. But on the back end, you're probably going to need something a bit more traditional.

Conor:  Can I just go back to the term exploratory and you said there that it means that you let the data do the talking and you've got no predefined ideas before you head into your audit. Now that sets a few alarm bells off in my head. That sounds pretty time consuming and resource intensive.

Yusuf:  Not necessarily. It depends on the nature of the organization that you're working with. Taking an exploratory approach doesn't mean that you are exploring everything manually. You may have worked towards setting up systems that enable you to identify exceptions as they occur. You may be looking for exceptions, and that means you may be discounting a whole set of standard activities. When you run continuous controls monitoring processes, for example - and I'm not suggesting that this is the way to do it for audit - you often start off with lots of false positives, lots of exceptions, and then you slowly narrow that down to identify what the real exceptions are. Similarly, when you're conducting an exploratory exercise, you'll get lots of exceptions when you start off, and you'll be finding lots of different things; either manually or in an automated way, you can slowly start to understand what exactly you need to focus on. And as you get more mature with that, you're able to directly target specific exceptions and specific scenarios to focus your audit activity.

Conor:  So even with the exploratory approach, at the far right-hand side - the more advanced end - of the spectrum, it sounds as if you still need some input or insights from the business, where some exceptions have been identified.

Yusuf:  Not sure about that. You would want to be talking to people within the management teams, but you also don't want any bias to exist within the work that you do. So yes, you're obviously going to take a lot of their input into account, but often there are instances that you want to identify and investigate yourself, regardless of what management had to say about it and what their reasons might be.

You'd still need to have that unbiased approach to the work that you do.

Conor:  That rounds off exploratory, the key thing being that it's a pretty advanced approach and you really need good quality data before you even consider heading down that route. We've got two more approaches, then, that sit in the middle. What are those two approaches?

Yusuf:  The one that's a little bit more on the basic side of the spectrum would be the process focus. And then the one we'll talk about last, which is a little bit more, I would say advanced, is the hypothesis approach.

Now, process focused is exactly as it sounds: you understand inputs, you understand the process, and you understand outputs. This is the traditional approach - the most common approach that we use for audits. We evaluate the design and the implementation of a process, looking for gaps or weaknesses or other opportunities to improve.

Generally, processes are designed in a reasonably standardized way. So we work out what the tests are going to be upfront, based on what we think the risks might be. Often that's based on a walkthrough with the business, or because we've conducted an audit in that area previously. But in a lot of circumstances, the determination we make about what we're going to look at, and what tests we're going to conduct, is not based on risk; it's based on a standard library of routines. And preexisting libraries of routines are something that constrain us, and have been constraining us as auditors, for many, many years.

I'm not saying that there's no place for it.  But the analytics software vendors often talk about these libraries of routines as one of the most significant benefits that they have built into their software. I think that it limits our ability to think about what it is that we need to do.

It does help us by providing some sort of starting point. It may help us determine whether the testing we're going to do is complete, to an extent, but that's usually when you're focused on fraud and compliance - that is, conformance. If you're looking at performance, or trying to have a focus on performance, those routines are not really going to help you.

There may be a few tests here and there that might be beneficial, but they're not going to help you much.

So, we look at each of the inputs, we look at the actual process, and then we look at the outputs. We then use those inputs to remodel the process, to see if we get to the same outputs, and that helps us determine whether there are any gaps or weaknesses within the process. Now, the process-focused approach is often what you get in a financial statement audit; external auditors usually follow a process-focused approach. There's nothing wrong with that. But financial statement audits are looking primarily at the risk of material misstatement in financial statements - one risk. When we look at internal audits or performance audits, we're looking at a much broader, more varied set of activities and expected outcomes.

So, when we focus on the process, we're asking: is that process working? What we should be doing is looking at whether we're achieving the outcomes that we need to achieve. If we need to focus on the process - if that's all we really care about - then fine, but that's usually a conformance thing. The minute we start moving into anything that relates to performance, we need to think about it differently.

Conor:  Just want to pick up on one thing you said there about preexisting libraries of routines. Do you have a view on whether over-reliance on those unduly limits the scope or the potential value of an audit?

Yusuf:  The most basic audit that you could do that uses data is a payroll audit. Over the last couple of years, we've seen so many organizations - ones that have either not conducted data-focused audits or have been using standardized sets of routines - caught not paying staff appropriately.

It's been shown that at least a few of those were because standard routines were used. So there are 15 routines that you do in payroll: run those 15 routines on the data you have, see what outcome you get, and compare that to the process. If you really want to understand what's going on, you need to ask: what is the outcome we're looking to achieve? What do we want to get out of this process? And then start thinking about the different scenarios that are specific to that organization or entity.

When you get into standard routines-based analytics, you could say, is there overtime or isn't there overtime? And if there isn't overtime, well, we don't really need to care about it. If there is overtime, then let's check that the overtime that's been captured into the system has been paid for.

Now, if that's all you're doing, you're missing a massive opportunity, or a massive potential risk. So in the first case, if you don't have overtime captured, you do need to go and understand whether - regardless of whether you say you're going to pay overtime - there is an expectation that you should be, based on what is, or isn't, in the contract.

That's a much longer discussion. But as a basic point: if you're not capturing data in your payroll system about when people are working, you're not going to be paying them that overtime, because you haven't captured it in the payroll system. Thinking more broadly, though: what defines when people are working?

And now that we're working remotely a lot more, that has been changing as well. So it's not just about what data is captured in a timesheet somewhere. It's about: how are people accessing systems? How often are they accessing their emails? Are there ways in which we can identify when somebody is working versus when they're not working?
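The contrast Yusuf draws between a standard routine and an outcome-focused check can be sketched in a few lines. This is illustrative only - the records, field names, and the idea of using system-access logs as a proxy for work are hypothetical, not from any real payroll system:

```python
# Hypothetical payroll records; field names are illustrative.
payroll = [
    {"employee": "A", "overtime_hours_captured": 4, "overtime_hours_paid": 4},
    {"employee": "B", "overtime_hours_captured": 6, "overtime_hours_paid": 0},
    {"employee": "C", "overtime_hours_captured": 0, "overtime_hours_paid": 0},
]

# Hypothetical system-access logs: hours of activity outside rostered time,
# used here as a rough proxy for when people were actually working.
access_hours_outside_roster = {"A": 0, "B": 0, "C": 9}

# Standard routine: overtime captured in the system should have been paid.
unpaid_captured = [
    r["employee"] for r in payroll
    if r["overtime_hours_paid"] < r["overtime_hours_captured"]
]

# Outcome-focused check: activity outside rostered hours with nothing captured
# suggests overtime is being worked but never entering the payroll system.
uncaptured_work = [
    r["employee"] for r in payroll
    if r["overtime_hours_captured"] == 0
    and access_hours_outside_roster.get(r["employee"], 0) > 0
]

print(unpaid_captured)  # ['B'] - caught by the standard routine
print(uncaptured_work)  # ['C'] - invisible to the standard routine
```

Employee C is the point of the example: a routine that only reconciles captured overtime against payments can never surface overtime that was never captured.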

Conor:  Okay. So that wraps up process focused. What's the last approach to using data in your internal audits or performance audits we want to talk about?

Yusuf:  This is about focusing on the expectation or the outcome. Understanding risks and opportunities more broadly. So, the hypothesis-based approach is where we look at - are we achieving the right outcomes?

There may be different ways in which we think about that, depending on the audit objective. Have we achieved the right outcomes in terms of any promises we've made to our customers, or expectations that they have? In the compliance world, it will be compliance with third-party contracts or legislative obligations.

In the public sector, particularly when we're thinking very broadly about certain matters and doing performance audits, it's: are we meeting policy intent? Regardless of what we're looking at - and we may be looking at any number of internal audit or performance audit topics - the objective is not a process. The objective is an outcome. So when we conduct a hypothesis-based approach, we don't limit ourselves to inputs, process, and outputs. We think about what it is that we're trying to achieve overall, then go backwards to understand what that means in terms of the expectations that we have, and then work out: are we meeting that expectation? And we use data to help work out whether that expectation is being met.

Conor:  And so, it may well be that if you're taking the hypothesis-based approach. So again, you're looking at what's the outcome that is intended by this policy or this legislation and so forth. There may be an opportunity later on to do another audit that actually looks at the process.

Yusuf:  Remember, this is about the broad approach you're going to take; it's not saying you're never going to evaluate the process. You may get to a situation where you've outlined three or four key hypotheses, but for one of them you're going to go through and understand input, process, and output, and what that means in terms of the outcome. Taking a hypothesis-based approach doesn't mean that you're going to ignore the process. It just means that you're going to focus on something bigger, which may or may not involve evaluation of a process. If the outcome is not being achieved, you may then go to the lower level and say: okay, let me understand this process, as part of the same audit. An example of where we've taken a hypothesis-based approach is where we look at something like recalculations. Let's say we pay commissions to third parties that refer business to us. You can either say: let me look at that process - understand what data goes into it, understand how the process works, and understand what the output is.

Then I'll take that source data, run it through the same procedures, and compare it to the output. But all you're then doing is redoing what management, or the entity, have done themselves. When you take a hypothesis-based approach, you say: the expectation is that all payments are complete and accurate. If this is my expectation, I want to be able to get to that. Let me take the source data I need, and model it based on the expectation. You create what the expectation is by using that source data, and you compare it to the actuals.

So, you don't use the same process that management have used; you don't follow that same process. You use a parallel process to create that expectation, compare it to the actuals, and then determine whether there are any differences between the two. By ignoring the process that management has undertaken, you minimize the potential for bias in the way you conduct that recalculation. So that's one example of how, by looking a little differently and not just trying to follow a process, you will get to a different result than if you just blindly follow the inputs, the process, and the outputs.
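The parallel recalculation idea can be sketched as follows. The contract rate, referral records, and payment figures here are all hypothetical; the point is that the expectation is rebuilt from source data and the stated contract terms, deliberately ignoring the steps management's own process takes:

```python
# Assumed contract term: a flat 5% commission on referred sales.
COMMISSION_RATE = 0.05

# Hypothetical source data: referrals as captured in the source system.
referrals = [
    {"id": 1, "sale_value": 10_000},
    {"id": 2, "sale_value": 4_000},
    {"id": 3, "sale_value": 20_000},
]

# Actual payments as produced by the business's own process.
actual_payments = {1: 500.0, 2: 200.0, 3: 900.0}

# Build the expectation independently, from source data plus the contract,
# rather than re-running management's procedures.
expected_payments = {r["id"]: r["sale_value"] * COMMISSION_RATE for r in referrals}

# Compare the independently modelled expectation against the actuals.
differences = {
    rid: (expected_payments[rid], actual_payments[rid])
    for rid in expected_payments
    if expected_payments[rid] != actual_payments[rid]
}
print(differences)  # {3: (1000.0, 900.0)} - one payment to investigate
```

Each difference then becomes an item to deep dive on: the audit model may be wrong, or the business process may be, and only investigation settles which.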

You don't agree with that?

Conor:  Not that I disagree. I find the conversation about bias, conceptually, in this context, quite difficult to understand. So, talking about bias there: where does it fit in?

Yusuf:  Think of a data-focused process where we've got a set of data that goes through a range of rules that management have decided on, and then produces an output.

Take the example we just spoke about, where we pay commissions for referrals, for products or services that we offer. There are a number of different ways in which bias will creep into a process, but in our experience there are three main ways in which that happens.

The first is in the translation of requirements into actuals - and that's the most prevalent, from what we've been seeing. The second is where you have several changes to a process for which band-aids are put in, as opposed to redefining the entire process. Those band-aids could consist of things like what we've been seeing quite recently - robotic process automation; I'm not saying there's anything wrong with it, but it often is a band-aid approach. The third is where the wrong data is being used. Let's explain each of those three. The first, like we said, is where the requirement doesn't translate fully into the process. When you have an expectation, you determine what the business requirements are in order to fulfill that expectation. Those business requirements translate into functional specifications. The functional specifications translate into technical specifications. A developer then codes something, which has a set of testing conducted against it.

Then quality assurance is completed, and it's put into production. In each of those steps - identifying the business requirement to determine the business rule, putting in the functional specification, translating that functional specification into a technical specification, and then actually developing the code - you will often find that there is some level of translation error that occurs somewhere along the chain.

And a translation error can be very minor. We've seen translation errors like 28 days becoming 30 days, because the rule said "the number of days in the month" and that month had 28 days. We've seen others where it's 365 days that we're working with and they forgot about the leap year, and that leap year created all sorts of problems, because you've got multiplication errors, et cetera. And those are just the simple examples. You get far more complicated ones, where the contract isn't written very well, or a particular business requirement that was based on a handshake wasn't documented well. The second way is where you've had a process in place, and then over the years you needed to make changes to the contract to satisfy changes in expectations, either on your end or the third party's end; or sometimes it's legislative obligations that change, or the operating environment just changes more broadly.
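On those day-count errors specifically: they are tiny per transaction but compound over a year. A sketch of the hard-coded-365 case during a leap year (the annual fee is hypothetical):

```python
# Hypothetical annual fee that must be translated into a daily rate.
annual_fee = 36_500.00

# Rule as specified: divide by the actual number of days in the year.
def daily_rate(year_days: int) -> float:
    return annual_fee / year_days

# Rule as coded: 365 was hard-coded, so in a 366-day leap year
# every single day is charged at the wrong daily rate.
coded = daily_rate(365)    # 100.00 per day
correct = daily_rate(366)  # about 99.73 per day

# Applied across all 366 days, the per-day error compounds.
annual_error = (coded - correct) * 366
print(round(annual_error, 2))  # 100.0
```

One forgotten leap year produces a full extra daily rate's worth of error over the year - exactly the kind of multiplication effect described above.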

If you don't think about the whole process when you are reestablishing or tweaking your rules to cater for that, then often you end up with band-aid approaches. And those band-aids can sometimes create difficulty later on, particularly where you have rules early on that do something, and then rules later on that undo what you've done - but you don't know about that, because you didn't get to it. So you need to follow the whole process if you are going to do something like that. Regression testing is where you go and check that the new rule you've put in hasn't broken anything that existed beforehand. Now, these can be very, very complex rules, and if you do tweak a complex rule and don't test it properly at the end, then you could break it.
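Regression testing in this sense can be as lightweight as re-running known cases after every rule tweak. A sketch with a hypothetical discount rule and made-up regression cases:

```python
# A hypothetical pricing rule that has been tweaked over the years.
def discount(order_value: float, is_member: bool) -> float:
    rate = 0.0
    if order_value >= 100:
        rate = 0.05
    if is_member:
        rate += 0.02  # newer band-aid rule layered on top of the original
    return round(order_value * rate, 2)

# Regression cases: known inputs paired with the outputs the rule produced
# before the latest tweak. Re-run after every change to the rule.
regression_cases = [
    ((50.0, False), 0.0),
    ((200.0, False), 10.0),
    ((200.0, True), 14.0),
]

for args, expected in regression_cases:
    assert discount(*args) == expected, f"rule regression for {args}"
print("all regression cases still pass")
```

If a later band-aid silently changed one of these outputs, the re-run would flag exactly which historical case broke - which is the check Yusuf says is often skipped.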

The third is where you're just using the wrong data. Your source system has changed, or the extract isn't correct anymore. And when I say anymore: sometimes people build rules into the extracts without thinking about it, and then when you run your rules as they change, you end up undoing some of the rules that were created for the data extract.

Now, this becomes very technical, but often it's about the data that is being used: you don't have the right granularity, or you just have the wrong data going in, or you've tapped into a different source because you couldn't get something. So those are the three key reasons we find discrepancies in processes, and those are the reasons we try not to bias ourselves with existing processes, and instead try to recreate the expectation using data from as early as possible in the chain.

So we try to go as far back as possible, as close to the source systems as possible. Even then, because you may not be able to get all the way to the source system, there is potential for our expectation answer to be wrong. But those are the key reasons you have potential for bias if you follow the process as it stands.

Conor:  So, let's say we've taken the data. We've done our own modeling as auditors and we've arrived at an outcome. And like you said, the outcome may be different than the outcome that's being reported by the business. What becomes the point of truth there then? How do you reconcile those differences?

Yusuf:  So then there is work to be done. Sometimes the audit team is wrong, and sometimes the business team is wrong, but it's only in deep diving into those differences that you get a true understanding of what is occurring.

The situations where we were wrong fall into several buckets, but the core ones are: we used the wrong data; or we missed a key item that wasn't detailed in the expectation; or the expectation wasn't stated well enough. In those cases, we'll either change our data input, get the right data input, or come to understand the nuance within the process that we hadn't understood. You'll get lots of exceptions - in the example I just gave, sometimes there were particular nuances around how one product was interacting with another product, or a product was closed and another product was opened. But the most important case is where the expectation wasn't clearly laid out, and that then needs to be fixed: the contract, or the document that explains the expectation, needs to be detailed to include those items.

On the flip side, where the audit answer was correct, you've again got to go into the detail of each of those individual items to determine why that's happening. And that can happen because either the data going in was wrong or the process steps were wrong. Process steps often start off well when a process is established, and then slowly, over time, as new exceptions are introduced, they start to veer off track.

There is a lot of iteration. You don't know what the point of truth is until you dive into those matters, but what it does give you is a very deep understanding of exactly how we're meant to achieve this particular objective as a business. And when you go to report, you can report in that way.

Conor:  And that's one of the great benefits, I think, of a hypothesis-based approach: if the results of your hypothesis-based audit are different from the perceived expectations of the business, it gives you an opportunity to have that bigger conversation. Did we even have clear expectations as a business in the first place?

It's a real reflection point.

Yusuf:  There are questions we usually ask ourselves in determining whether to follow a process-based approach or a hypothesis-based approach - three key questions. The first is: if we take the process-based approach, are we going to be able to see what is not there?

If it's a simple process, if it's something you've looked at before, you know what you're looking for and it's very clear, then you may not necessarily need to go down the hypothesis-based route, which is a little more difficult. But if you don't know the process very well, and you're not sure you'll be able to identify all the potential gaps and risks, then the process-focused approach is not where we're going to go.

It's easier to see what is not there when you take a hypothesis-based approach. The second question is when we want to evaluate the process in depth - we want to understand exactly what is going on - but at the same time we want to provide a broad understanding of the outcome that's looking to be achieved.

So, this is not just about understanding the process in depth and picking up some low-level opportunities.

If we are looking at a very limited part of the process, and we just want to go in depth into that, we may take a process-focused approach. If we want that broad, higher-value reporting and higher-value activity out of the audit team, we go hypothesis based. The third question is where we think there might be some errors in the process, but we're struggling to find them.

We suspect they are there - we may even have been told they are there - but we weren't able to find them because we were just following the established process. So we then go and take the broader, outcome-based approach.

Conor:  And the hypothesis-based approach is the one we use most often in our work.

Okay, so the four broad approaches: using data for reporting only was at the basic end of the scale. Then we went right to the extreme - the more advanced end - of the spectrum, where some entities (few, but some) may take an exploratory approach, really letting the data speak to them and demonstrate what they could potentially look at. Two more approaches fell in between those bookends. The first was process focused, where we do walkthroughs; we talked about some of the limitations, and the reliance some auditors place on predetermined libraries of routines, and how that can narrow the focus of an audit and perhaps not provide as much value as possible. And lastly, the hypothesis-based approach: focus on the outcomes that are expected to be achieved, work back from there, develop some hypotheses, and then test those hypotheses to see if they can be proved or disproved.

And if need be, then we can drill down to a deeper level and look at the underlying processes within that system. So, four broad approaches there. Anything else we need to add?

Yusuf:  No. Just that if the discussion is difficult to follow, there's a blog article on our site called "Four audit data approaches". It might be a little easier to read, and you can see a diagram explaining where on the spectrum each of those approaches sits, the different challenges, et cetera. We'll put a link to that in the show notes.


Conor McGarrity
Podcast host


An authority on data-focused audits, Conor is an author, podcaster, and senior risk consultant with two decades of experience, including leadership positions in several statutory bodies. He's driven to help auditors uncover new insights from their data that help them to improve organisational performance.
Yusuf Moolla
Podcast host


Fellow podcaster, author, and senior risk consultant, Yusuf helps performance auditors and internal auditors confidently use data for more effective, better quality audits. A global leader in data-focused auditing and assurance, Yusuf is passionate about demystifying the use of data and communicating insights in plain language.