5 challenges to tackle – data in audit

If you are an assurance leader (internal audit/performance audit/risk assurance), you want to:

  • provide assurance to the Board and to management.
  • help maximise value (efficiency, effectiveness, economy) and customer satisfaction.
  • ensure that compliance is maintained.

As part of your overall assurance program, using data can help you achieve these objectives by surfacing both upside opportunities (e.g., revenue leakage that can be recovered) and downside risks (e.g., control breakdowns).

Evaluating full data populations – going beyond sampling – vastly improves coverage, and enables higher levels of confidence in results.

You have started using data for your audits, but something is just not quite working, and you know you can achieve more.


So first you need to find the problem(s). Do any of these sound familiar?


Five data in audit challenges:

  1. Access to data – you can’t get all the data, or you can’t get it quickly enough.
  2. Low value – the analysis doesn’t provide new insights.
  3. False positives – too many of them; the results are overwhelmingly noisy and distract you from the real issues.
  4. Superficiality – the results are not deep enough to properly understand and refine the problems or to provide opportunities for improvement.
  5. Timing – the results are not available in time for reporting/concluding.


How you can overcome these challenges:

  1. Access to data – Agility and flexibility in approach: By adopting an iterative approach, you can reduce the burden on management and get a quicker initial result. A win-win. Start with a small set of data that can shape an initial set of results; discuss among the audit team and build on it by adding more data if required and exploring in more depth. Then, after further discussion, and perhaps debate, refine with additional data and/or further analysis.

  2. Low value – Up-front identification of hypotheses: Test libraries are old hat. They can be useful for identifying some common tests for specific assurance objectives, but their value-add potential is limited. Ask your team to abandon this outdated approach and instead focus on your organisation’s unique strategic objectives, brainstorming test ideas around them.

  3. False positives – There are techniques to deal with these, such as supervised machine learning trained on how previous exceptions were dispositioned (see the first sketch after this list). Access to the right combination of tools can help alleviate this problem too.

  4. Superficiality – Validate assumptions and outcomes as early as possible. In some cases the agility and flexibility in approach outlined above can help overcome this challenge, given the iterative refinement and progressive levels of depth. Avoid over-emphasis on dashboards: the use of visualisation tools (a.k.a. dashboarding) is important and necessary, but if it forms the bulk of your “analytics” effort, without actual underlying data cleansing, blending and analysis, you may end up with superficial results (the second sketch after this list shows the kind of underlying work this means).

  5. Timing – Try to plan the analytics work well in advance of the audit – generally three months before the audit is due to commence. The agility and flexibility in approach outlined above can also help here, as you won’t be waiting until the end for results. In some cases you may simply need to accept the outcome, particularly if the work was highly experimental; if so, carefully capture the lessons learnt and how to avoid a repeat, then feed that into the planning for future projects.
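
To make challenge 3 more concrete, here is a minimal sketch of supervised machine learning applied to exception triage. It assumes you hold a history of previously reviewed exceptions where the audit team recorded whether each one turned out to be a genuine issue; the file names, column names and choice of model are illustrative assumptions, not a prescribed approach.

```python
# Minimal sketch: supervised machine learning to triage analytics exceptions.
# Assumes a CSV of previously reviewed exceptions where auditors recorded whether
# each one turned out to be a genuine issue ("is_genuine" = 1) or a false positive (0).
# File names, column names and the model choice are illustrative only.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Historical exceptions with the auditor's final disposition
history = pd.read_csv("reviewed_exceptions.csv")

features = ["amount", "days_to_payment", "vendor_transaction_count", "is_weekend_posting"]
X = history[features]
y = history["is_genuine"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

# Train a classifier on past review decisions
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Check how well it separates genuine issues from noise before relying on it
print(classification_report(y_test, model.predict(X_test)))

# Score this period's new exceptions and rank them so the audit team
# reviews the most likely genuine issues first, rather than wading through noise
new_exceptions = pd.read_csv("current_exceptions.csv")
new_exceptions["likelihood_genuine"] = model.predict_proba(new_exceptions[features])[:, 1]
prioritised = new_exceptions.sort_values("likelihood_genuine", ascending=False)
prioritised.to_csv("prioritised_exceptions.csv", index=False)
```

The point is not the particular model but the feedback loop: each period’s review decisions become training data that makes the next period’s results less noisy.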
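
Similarly, for challenge 4, the kind of underlying cleansing and blending that should sit beneath any dashboard might look like the following. The payments extract, vendor master and their columns are assumptions for illustration only.

```python
# Minimal sketch: cleansing and blending two extracts before any visualisation.
# File names and columns (a payments extract and a vendor master) are illustrative.

import pandas as pd

payments = pd.read_csv("payments_extract.csv", parse_dates=["payment_date"])
vendors = pd.read_csv("vendor_master.csv")

# Basic cleansing: standardise identifiers, drop exact duplicates
payments["vendor_id"] = payments["vendor_id"].astype(str).str.strip().str.upper()
vendors["vendor_id"] = vendors["vendor_id"].astype(str).str.strip().str.upper()
payments = payments.drop_duplicates()

# Blend: enrich payments with vendor attributes, keeping unmatched payments visible
blended = payments.merge(vendors, on="vendor_id", how="left", indicator=True)

# Simple analysis before anything is dashboarded:
# payments to vendors missing from the master file, and duplicate-looking payments
orphan_payments = blended[blended["_merge"] == "left_only"]
possible_duplicates = payments[
    payments.duplicated(subset=["vendor_id", "invoice_number", "amount"], keep=False)
]

print(f"Payments with no vendor master record: {len(orphan_payments)}")
print(f"Possible duplicate payments: {len(possible_duplicates)}")
```

Only once this groundwork is done do the visualisations have something substantive to show.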


What else is your team struggling with? And how are you removing those blockers?

More on this in the related series of podcast episodes, starting with this one: Episode 5.