Imagine ‘This file is corrupted and cannot be opened’ popping up on your screen when you boot up your machine, ready to begin your day’s work.
This message appears on your one-stop, all-encompassing, security-protected, macro-driven spreadsheet, the one you depend on entirely for your daily, weekly, monthly and quarterly data gathering, validation and management reporting responsibilities. In short, your whole workflow, the one thing you rely on, your golden source of truth, the thing that gets you where you need to go... just let that sink in. The consequences would feel catastrophic, and the work to recreate it all would appear daunting! OK, you may have a backed-up version and so lose only one day’s effort, but the risk is very real.
Now, it may appear I am against spreadsheets. Not a bit of it: since Dan Bricklin and Bob Frankston developed VisiCalc at the end of the 1970s, the rise of the spreadsheet has been astronomical, with an estimated one billion users globally on any given day. Many of the best calculative innovations probably started life on the blank grid of a spreadsheet, and I can attest to extraordinary spreadsheet developments by very talented individuals: mass data reconciliations, profit and loss calculations, value-at-risk models, and financial models built out to various ends. All of these have proved incredibly useful in driving business decision-making.
Despite other technical innovations, the reliance, and indeed over-reliance, on this incredibly useful tool persists. Spreadsheets continue to create avoidable risks in the control and auditing of operations. Prominent risks include multiple versions of a single spreadsheet, compromised data integrity, uneven user skill levels, key person risk, hidden data, unverified calculations, a lack of auditable history and missing data encryption. It would be interesting to see how many of these valid risks make it into organisational risk registers. On top of this, there is now a host of technical challenges around scalability and supporting real-time decision-making on constantly refreshed data, all of which affect the performance, accuracy and quality of a spreadsheet.
Without downplaying the others mentioned, the two risks with the greatest potential impact are key person risk and data integrity.
Key Person Risk
Very often individuals take ownership of spreadsheets, and key person risk becomes a real issue during prolonged periods of absence, or when that person leaves and a critical issue materialises. Without adequate knowledge of a spreadsheet’s design and process, unpicking the logic and methodology can be an extremely arduous task with no guarantee of success, especially if there are embedded macros that need to be pulled apart! In many instances it may well be prudent to redesign and start all over again, but that runs the risk of finding yourself in the same spot at some point in the future. Procedural documentation can go some way to easing this burden, but the risk itself remains.
Data Integrity
Another prominent risk is data integrity. How can we be sure that the data presented in a spreadsheet is free from manipulation? Do we simply assume it is the golden source of truth? How do we gain comfort that the data we observe rests on the correct assumptions and calculations to produce the desired result? In simple terms, this relies on the integrity of the data handler and a regular audit of the spreadsheet’s methodology, inputs and results, combined with an ongoing risk-monitoring programme to keep it validated. That is a lot of trust and effort expended in mitigating this risk, time and effort that could be better spent driving your business forward!
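Even a very basic tamper check illustrates what “free from manipulation” means in practice. The Python sketch below records a SHA-256 digest of a workbook file at sign-off and flags any later change; the file name is a hypothetical stand-in, and a real control would target the workbook itself and keep the baseline digest somewhere the spreadsheet’s users cannot edit.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 64 KiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Demo with a stand-in file; a real control would hash the .xlsx itself.
workbook = Path("month_end_pnl.bin")             # hypothetical file name
workbook.write_bytes(b"signed-off numbers")
baseline = file_digest(workbook)                 # recorded at sign-off

workbook.write_bytes(b"quietly edited numbers")  # simulate an unaudited change
if file_digest(workbook) != baseline:
    print("Workbook changed since sign-off: investigate before relying on it.")
```

Note that this only tells you that something changed, not what or why; that is exactly the gap a proper audit trail and monitoring programme has to fill.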
So, what are the alternatives?
There are two approaches. You can stick with spreadsheets and build robust checkpoint reconciliations, password protection, controlled user access and read-only distribution, backed by a daily monitoring control process (while keeping everything crossed!). Or you can take the route of specifically designed vendor applications that take the pain and effort out of legacy processes and provide auditable, risk-sensitive solutions that become fully embedded in your functional operating model.
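For readers taking the first route, a checkpoint reconciliation need not be elaborate. This Python sketch compares a control total from a source-system extract against the same total from a spreadsheet export and flags any break; the file and column names are hypothetical stand-ins, and a production version would reconcile row by row as well as in aggregate.

```python
import csv
from decimal import Decimal
from pathlib import Path

def control_total(csv_path: Path, column: str) -> Decimal:
    """Sum one numeric column to form a control total for reconciliation."""
    total = Decimal("0")
    with csv_path.open(newline="") as f:
        for row in csv.DictReader(f):
            total += Decimal(row[column])
    return total

# Demo data standing in for a source-system extract and a spreadsheet export.
Path("source_extract.csv").write_text("trade_id,amount\n1,100.25\n2,-40.10\n")
Path("spreadsheet_export.csv").write_text("trade_id,amount\n1,100.25\n2,-40.00\n")

source = control_total(Path("source_extract.csv"), "amount")
sheet = control_total(Path("spreadsheet_export.csv"), "amount")
if source != sheet:
    print(f"Break detected: source {source} vs spreadsheet {sheet} "
          f"(difference {source - sheet})")
```

The point of the sketch is the cost it implies: every such control has to be built, scheduled, monitored and maintained by someone, which is exactly the overhead a purpose-built application absorbs for you.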
In most instances these vendors have been where you are. They understand how to observe data, handle data volumes, assess real-time impacts, control calculation integrity, get the most out of empirical data, and provide the comfort and controls that ensure best-practice, fully auditable data with unquestionable integrity. These vendors continue to evolve, providing comfort and efficiencies for the betterment of the markets in which they operate, just as Dan Bricklin and Bob Frankston did all those years ago.
Come and talk to us about how we help the mission-critical areas of Treasury and Trading in banks adopt robust tools and avoid over-reliance on risky spreadsheets. Neil.McManus@eurobase.com