In this episode we discuss five key challenges that assurance professionals - internal auditors and performance auditors - face in using data for audits.
This is part 1 - outlining the challenges. In parts 2 to 6, we will discuss solutions to each challenge individually.
The five items are:
1. Access to data
2. Low value of insight
3. False positives
4. Superficiality in results
5. Reporting, including timing and audience
So what are we talking about today Yusuf?
Analytics challenges that assurance professionals often come across. Five in particular, actually.
We're not gonna talk about why assurance professionals - whether internal auditors, performance auditors or risk assurance professionals - want to use analytics as part of their overall programmes. We're going to assume that there's a good understanding and appreciation for analytics, and that people have already started doing this. If you're listening to this and you haven't started using analytics as part of your assurance programme, there may be a bit more work to do. Even so, there'll be some lessons to learn from this, because it's based on discussions we've had and projects we've been involved in with various assurance practitioners over the years. So with that as a start: one of the most commonly cited analytics challenges is access to data. You can't get all the data that you need, or you can't get that data quickly enough to be able to do the work that you need to do.
That's assuming, of course, that you've identified the data you need in the first instance for the particular assurance project.
That's an interesting point. Often you need to get data and analyse it first before you're able to determine what to do further - it's iterative. You don't always get to the exact data that you need in the first instance. So you get some data, you use that data to answer some initial questions you had, and then you use those answers to determine what to do next. That iteration means that if you can't get the initial data in a timely manner, you're then in a situation where getting the further data you need to go deeper becomes even more difficult.
But there must be a jumping off point, right? You need to start somewhere, so you need to understand what it is you might need in the first instance, before you ask the question.
With experience, you can usually get to each of the different levels that you need. Also from experience, this applies in almost every situation we've had where we're using a reasonable level of analytics. I mean, if you're doing sort of basic procurement analytics, or some of the more basic analytics that audit teams get involved in, then you usually know what you want upfront because it's straightforward. You're gonna be testing the same sort of basic standard hypotheses - three-way match or whatever. You know what you're looking for because thousands of other people have done it before; it's pretty stock standard. However, when you're doing real, proper analytics as part of assurance - using data to help answer questions that you may not have thought of before the audit, or that nobody else may have asked before - you do need that iterative approach. As you start asking questions of the data, you will invariably find that you need a certain percentage of data that you didn't identify upfront.
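For listeners newer to this, the "three-way match" Yusuf mentions can be sketched in a few lines: reconcile each invoice against its purchase order and goods receipt, and flag mismatches as exceptions. This is a minimal illustration only - the PO numbers, amounts, field names and tolerance below are made up, not a real ERP schema.

```python
# A minimal sketch of a three-way match: reconcile invoices against purchase
# orders and goods receipts by PO number, and flag mismatches as exceptions.
# All identifiers, amounts and the tolerance are illustrative assumptions.

purchase_orders = {"PO-1001": 5000.00, "PO-1002": 1200.00, "PO-1003": 750.00}
goods_receipts = {"PO-1001": 5000.00, "PO-1002": 1100.00}  # PO-1003 not yet receipted
invoices = {"PO-1001": 5000.00, "PO-1002": 1200.00, "PO-1003": 750.00}

def three_way_match(orders, receipts, invoiced, tolerance=0.01):
    """Return (po_number, reason) pairs for invoices that fail the match."""
    exceptions = []
    for po, amount in invoiced.items():
        if po not in orders or po not in receipts:
            exceptions.append((po, "missing purchase order or goods receipt"))
        elif abs(amount - orders[po]) > tolerance or abs(amount - receipts[po]) > tolerance:
            exceptions.append((po, "amount mismatch"))
    return exceptions

print(three_way_match(purchase_orders, goods_receipts, invoices))
```

Run over a full population instead of three records, the same logic is what generates the exception lists discussed later in the episode.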
Take the example of evaluating the completeness and accuracy of payments made to, say, third parties. You start off by determining what you need up front, and you look at the high-level payments that you've made. Once you understand what those high-level payments are, you then want to dig deeper, particularly into those areas where there is a large volume or large value of transactions - and usually large value goes along with large volume. You're not really going to be doing detailed analytics where you have a very small volume; that usually doesn't lend itself very well to broader analytics. When you do that, you then say, ok, these are the individuals or third parties for whom we're gonna be looking to get more data. If you're able to identify that fairly quickly up front, or if you already know from analysis that you did up front in the audit who those third parties might be, you might be able to identify that you need those data sets. But sometimes you hit a roadblock and you just can't go further, because you can't actually get to the data in the way that you thought you would up front. The data is not granular enough, or you just can't access it because of limitations in the third-party contracts that you have. So then you start looking for alternate sources of data. One that we often find is a really good alternate source in a situation like that is complaints data. You won't necessarily think about complaints data up front - you're dealing with third parties, so why would you need to look at complaints? Quite often, you use it to triangulate payments that need to be made, or payments that need to be received from those third parties. It's not always easy to identify upfront exactly what you need. You can try your best, but quite often you then need to make iterative requests for data.
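To make the complaints-data idea concrete, here is one rough sketch of the triangulation described above: complaints that reference a third party with no payment on record can point at payments missing from the extract. The records and field names are invented for illustration, not any particular system's schema.

```python
# A rough sketch of triangulating payments with complaints data: a complaint
# about a third party that has no payment on record is a candidate for
# follow-up. All records and field names below are illustrative assumptions.

payments = [
    {"third_party": "Acme Pty Ltd", "amount": 4200.00},
    {"third_party": "Beta Services", "amount": 900.00},
]
complaints = [
    {"third_party": "Acme Pty Ltd", "issue": "late payment"},
    {"third_party": "Gamma Group", "issue": "payment never received"},
]

paid_parties = {p["third_party"] for p in payments}

# Complaints about parties absent from the payments data suggest the payments
# extract may be incomplete - exactly the completeness question being tested.
follow_ups = [c for c in complaints if c["third_party"] not in paid_parties]
for c in follow_ups:
    print(f"{c['third_party']}: {c['issue']} (no payment record found)")
```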
So can I just ask you on that, Yusuf - in your experience, does internal audit often get pushback on requests for data sets that are less traditional? So not your payroll, not your procurement ERP-type data, but where the particular assurance project involves a new approach, a new topic or a new way of looking at something. In your experience, do you get much pushback from the business in the provision of that data? The reason I ask is that with performance audits, given that they're quite unique audits every time, you do receive a lot of questions from public sector agencies: why do you want this sort of data? What questions are you going to be asking of it? What's the ultimate objective? So I'm just interested in your views on whether that's something you've encountered in the internal audit sphere.
Absolutely. It does depend from audit team to audit team, and on the relationship they have with the relevant individuals they're gonna be getting data from. In most situations that we've seen, there's some level of pushback: what are you gonna be doing with this data? Part of that is just the level of maturity of the analytics work being performed, particularly initially, when analytics work is starting to be conducted. Because the business is not used to providing that level of data to auditors, there'll be some level of pushback - you haven't asked for this sort of thing before. Business process documents, samples, files, etc. - that's where auditors have traditionally played.
Would you say there's a level of fear there, almost, in those situations?
In some cases, it is fear. For example, if the audit team are quite heavily focused on finding problems and potential fraud, or if there's a fear that the internal audit team are going to find something that the business or management should have found themselves. Depending on how that is represented to the audit committee or to the CEO or to others, there might be a bit of fear. So: fear of auditors finding errors that exist, because auditors look for those errors; fear of them finding fraud, because auditors tend to look a little for fraud; or fear of them finding things that management should have known about already.
Okay, well, I would say that parallels the performance audit sphere. Certainly, when requesting data in the performance audit world, there's almost an element of education, where the performance auditors need to educate data custodians within public sector entities about why the data is being obtained, how it's going to be analysed and how it will be used.
That gets easier as more analytics work is conducted, because the more of it you do, the more of an understanding there is of how you're going to be treating the data, what you're going to be doing with it, and the fact that you're going to be doing it in the first place.
There's always pushback when you try to do something that's a little bit different. At the extreme end of ease of access to data is where auditors actually have direct access to data warehouses or other data stores, and they can just go and get data themselves. But even in that situation, because the data warehouses don't always have everything you need, you still then need to go and find data elsewhere. So that's the first one - access to data. The second challenge - and this is where you've got the data and you're able to do some analysis - is providing only a low level of insight. You've done all of this work, and you aren't able to provide any new insights with the data work you've conducted.
So what's the primary reason there, Yusuf, why you wouldn't be able to provide any new or revealing insights? Is that a capability thing on the part of the assurance practitioners, or what's the main cause?
There's obviously a spectrum again for this. One cause is that you haven't actually selected the right hypotheses to test for. You've selected hypotheses based on what you know, and what you know won't necessarily change significantly as you go through the years. So you're expecting that the data is just going to suddenly give you an answer that you had no idea about before, and that expectation can fall flat. Low value, because you're not able to get beyond what you thought you would get beyond - a bit of an absence of imagination, or an absence of detail in the upfront thinking. The other cause is that sometimes there just isn't anything to say. The process is working well, everything seems to be going quite well, and despite the level of detail you've gone into in understanding the process, what might go wrong, and what other opportunities there might be, you're just not able to generate anything new. The business is doing well with it, and that's really a good thing, right? The problem is that you have spent quite a bit of time and money to generate insights, and you haven't actually generated anything new. The reality is that what might initially appear to be low value is actually providing insight that everything is going well. So "low value" is a bit of a misnomer there.
I was about to dispute that actually, yeah.
Yeah, in the first instance we spoke about, it really is low value - you just haven't been able to do what you need to do to find anything that's new and provides a different perspective; you haven't brought different data sets together, and so on. In the second instance, it's not a real issue, it's a perceived challenge. You think that you need to provide some level of new insight, but actually you don't, if you're able to show that management are doing what they need to do.
You can still provide assurance to management that things are working well - the controls are operating, the risks are being mitigated, and the business is performing as intended.
That's right. And so that mindset shift is from needing to find findings - or opportunities for improvement, if you like - to providing that level of assurance. Because really, you're not engaged to find problems. You're engaged to determine whether things are working, right?
Ok, so that's the second one.
The third one is around false positives, and this comes up very often. This is where you've analysed a whole bunch of data, you've looked through various types of rules, and you've got to a point where you're finding so many exceptions that the results are simply overwhelming. They're distracting you; you can't focus on what you need to focus on. There's a lot of noise in there, and you just don't know what to do with it. That happens quite often because you're dealing with large data sets - you can quite easily get into a situation where there are just so many exceptions that you don't know what to do with them.
And I'd imagine, from time to time, particularly for more junior people, that can be quite a confronting scenario. You have all these exceptions, you've done lots of good work, and you're just spinning your wheels, thinking: what the heck do I do with all of this?
Part of it stems from the nature of analytics. You're not looking at 25 samples; you're looking at 100% of the population, and you're bringing different data sets together. You will find a range of exceptions that can either be explained or can't be. When you're sitting on that mountain, it is often quite difficult - particularly in the early days of using analytics within assurance functions - to work out what to do next. There's just so much noise.
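To put a rough number on that mountain: even rules with low individual hit rates, run across a full population rather than a sample, flag thousands of records. The toy simulation below makes the point - the population, rules and hit rates are all invented for illustration, not real audit data.

```python
# Toy illustration of exception volume in full-population testing: two rules
# with roughly 1% and 5% hit rates over 100,000 simulated records still flag
# thousands of exceptions. Nothing here is real audit data.
import random

random.seed(0)
population = [{"id": i, "amount": random.uniform(0, 10_000)} for i in range(100_000)]

rules = {
    "round_amount": lambda r: r["amount"] % 100 < 1,             # suspiciously round values
    "just_below_limit": lambda r: 9_500 < r["amount"] < 10_000,  # near an approval threshold
}

exceptions = [
    (r["id"], name) for r in population for name, rule in rules.items() if rule(r)
]
print(f"{len(exceptions):,} exceptions to review from {len(population):,} records")
```

With 25 samples, the same rules would surface one or two items; with the full population, a team has thousands of lines to disposition - which is exactly the overwhelm being described.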
So what are some of the practical steps that people who find themselves in that situation can take?
This first episode is about identifying what the challenges are. The next episode, directly after this, will outline those challenges again and then explain what the solutions are. So you have to listen to both to get them, unfortunately. Because it's such a broad topic, we had to split it up into first identifying the challenges and then working through the solutions.
So this is a bit of a teaser episode for the next one for you listeners out there.
We've done number three. Number four is superficiality in results: the results are not deep enough to properly understand and refine the problems, or to provide opportunities for improvement. It links to the previous three challenges - access to data, low value and false positives - but it's quite a specific item as well and does extend beyond them. You're in a situation where you either haven't got enough data to get to the specific answer that you need, or you haven't gone through and understood the process properly and designed your rules with the right level of granularity to get to a proper result. Or you're not able to provide opportunities for improvement because you've done all of the analysis using the data that you have, but you weren't able to use any data or other techniques to get to root cause analysis - to understand exactly why things were going wrong and what the opportunities for improvement are. Do you see that in performance audit?
We do see that in performance audit, though perhaps not as much recently as used to be the case. Like with any new process, as you said, the maturity grows over time. But because there's quite a big budget generally attached to performance audits, there's quite a groundswell of expectation that a performance audit that uses analytics is going to deliver all these new insights and surface all these problems and opportunities that we can deal with. That in itself creates a heightened expectation that if you're getting all this data for the performance audit, you will achieve all of that. Sometimes you get 70% of the way through the performance audit and you think: guess what, we're not gonna be able to deliver all of the things that perhaps we thought we would through the analysis of the data. So it can be problematic. It comes back to probably one of the key things you and I both learned early on in our auditing careers, Yusuf: under-promise and over-deliver, an old adage that people would be familiar with. Because there are so many unknowns with the data itself, you need to get a good handle on what you're looking at before you can talk to stakeholders about what they can expect to see from your analysis.
Okay, so that's the fourth one, and then the fifth one. We used to think about it just in terms of timing; what we're seeing more recently is that it's a combination of timing and reporting. Timing being when the results are available - but more broadly, it's about the actual reporting, and I think timing is really a subcomponent of reporting. Reporting is the last one, and it's a superset of a few different challenges. The main one is timing: the results are not available in time for reporting. Another is that the work conducted is not suitable for reporting, tying into a few of the previous challenges. But importantly, what we've seen quite a bit over the last little while is that the reporting is not attuned to the needs of the audience. We produce visuals or other outputs that are "cool" but not necessarily what people need. Then we get feedback that the reporting is not great, and we don't know why, because we thought we did a really good job with some nice colours and graphics and things. But the feedback from the audience is that the reporting is not great. So that's a challenge for us, because that is something we then have to deal with.
You have to really know what their needs are and how they're gonna use the outputs that you're producing.
So we spoke about five things there. We spoke about access to data, provision of value, false positives, superficiality in results and then reporting with a focus on timing. Which of those do you think is the most challenging item for conducting performance audits?
I'm going to go to either end of the spectrum of those five and weight number one - access to data - equally with number five - reporting and audience. Why is that? Because I think we have some way to go in working with our data owners, custodians and stakeholders on why we're extracting that data from them, how we're going to use it and what we're going to do with it, as we picked up earlier. Equally, though, given that we have a few different audiences, I think we need to understand how we actually portray this data in the most meaningful way to change behaviour or have an impact on how something happens in the public service. So I would have to say one and five for me, in terms of performance auditing, are probably the biggest challenges at the minute.
That's quite interesting, because within the internal audit world, if I reflect on those five challenges, I would say that number one has been a perennial challenge for most organisations, and number five is coming to the fore a lot more now. Those become less challenging as the level of maturity increases, and that's when the other three come in as more detailed challenges. Value and superficiality would probably appear first. False positives can come up at generally any time, but more mature functions are able to deal with them a bit better. So it's almost like one and five, similar to yours, converging into the other three as you get more and more mature. Maturity means that you're getting to a higher level of understanding of what it is that you do, and then some of the nuances start to play in. But one and five, for sure, would be the high-level items. So, similarly to performance audit, most internal audit functions would struggle with those.
To my mind, Yusuf, one and five are probably the two of the five that have the most external-facing input into the audit. That's where you're dealing with people outside the team, or even outside the organisation, to get assistance to do your job. Whereas two, three and four can be worked on capability-wise over time, potentially internally within the organisation or within your performance audit team.
There are a few things that we can do internally as well on number one and number five. Most of it, though, is based on the relationships we have with external parties - even on our understanding of what it is that they need.
Five assurance analytics challenges that we'll have solutions for in the next episode. The first is access to data. Then low value of results; false positives, so too much noise; superficiality in results, where the results are not deep enough; and reporting, with a focus on timing and audience. Thanks Conor.
Thanks Yusuf. Stay safe and we'll catch you in the next episode.