It has happened before:
- Mistakes across multiple spreadsheets turned the NSW State Government's forecast AUD337m budget deficit into an AUD680m surplus
- A cut-and-paste error cost TransAlta approximately US$24m - a whopping 10% of its profit for that year
- A typo caused an overestimation of student enrolments at the University of Toledo, leading to a loss of US$2.4m in projected revenue
Of course, spreadsheets still have their place - I continue to use Excel for specific tasks.
But for large and complex models in particular, you can build for sustainability and reduce the risk of inadvertent human error.
How do you do that?
Here are a few tips:
- Review: apply a minimum of two types of QA/review to each model:
i) A technical peer review (ideally by someone who has not been involved in developing the model) to evaluate the accuracy of the formulae, calculations and code.
ii) A business user review (by someone who understands the purpose of the model and the underlying business rules) to determine whether the model is working as it is supposed to, functionally.
- Protect: if you continue to use spreadsheet models, lock the calculation cells.
This provides a layer of protection from unexpected changes; however, it does not necessarily prevent other users from unlocking the cells.
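As a minimal sketch, cell locking can also be applied programmatically - for example with Python's openpyxl library. The file name, input value and formula below are illustrative, not from any particular model:

```python
from openpyxl import Workbook
from openpyxl.styles import Protection

wb = Workbook()
ws = wb.active

ws["A1"] = 0.05            # input cell: growth-rate assumption (illustrative)
ws["B1"] = "=100*(1+A1)"   # calculation cell we want to protect

# In a spreadsheet every cell is locked by default, but locking only
# takes effect once sheet protection is switched on. Unlock the input
# cell so users can still edit it, then protect the sheet.
ws["A1"].protection = Protection(locked=False)
ws.protection.sheet = True

wb.save("model.xlsx")
```

This keeps the inputs editable while the formulae sit behind sheet protection - though, as noted, a determined user can still unprotect the sheet.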
- Change platform: moving the model to an analytics platform allows you to enter your variable inputs through an interface (e.g. Excel, web form, visualisation tool) that reruns the model on the fly (behind the scenes) and produces the scenario results.
With this approach, the model can be used more broadly, with reduced risk of changes to the underlying formulae and algorithms.
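A minimal sketch of that separation (the function name, inputs and formula are illustrative, not any specific platform's API): the calculation lives in code, and users supply only the inputs, so rerunning a scenario cannot alter the underlying formula.

```python
def project_revenue(enrolments: int, fee_per_student: float,
                    growth_rate: float, years: int) -> float:
    """Illustrative model: projected revenue after compound growth."""
    return enrolments * fee_per_student * (1 + growth_rate) ** years

# Users submit inputs (e.g. via a web form); each scenario reruns the
# same, unmodifiable calculation behind the scenes.
scenarios = {
    "base":       {"enrolments": 1000, "fee_per_student": 500.0,
                   "growth_rate": 0.02, "years": 3},
    "optimistic": {"enrolments": 1100, "fee_per_student": 500.0,
                   "growth_rate": 0.04, "years": 3},
}
results = {name: round(project_revenue(**inputs), 2)
           for name, inputs in scenarios.items()}
```

Because the formula is defined once in code (and ideally version-controlled and peer-reviewed), there are no calculation cells for a user to overwrite by accident.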
If you are responsible for financial control and/or models that feed into strategic decision making, it is your responsibility to reduce the risk of error. Can you afford not to?