Most of ThirdLine's co-founders began their careers in auditing or analytics, where we each discovered first-hand the conspicuous limitations of the tools that still dominate the space today, along with other obstacles that can unfortunately come to define the work itself.
The most popular tools available to local government auditors have limited capabilities. The report's survey found that the primary tool in use is Microsoft Excel, followed by Galvanize / ACL, IDEA, and Power BI.
Few shops have someone who can write code. As the line from Top Gun goes, "It's not the plane, it's the pilot." There are plenty of free tools, but it is difficult to find analysts who can write the code those tools require.
The most powerful tools are used by only a handful of audit shops. Open-source tools like Python are among the most powerful analytic tools available, yet they are uncommon in audit departments.
All analytic tools require hefty infrastructure development!
The report found that people who possess both government auditing skills and analytics skills are very rare; those who do have a genuinely unique skill set. That is why even after an analytics team is built up, employee turnover remains a constant risk that can set the team back significantly.
Analytics is rarely truly incorporated into the workflow. How do you actually create analytics for more than the sake of saying you "do analytics"?
Organizational risk and cause/effect analysis are not used much.
Very few have found ways to automate analytics and feedback into their processes.
Our prior experiences at Fortune 500 companies, local governments, and public accounting firms bore out the above findings, but also lent us first-hand knowledge of additional problems that hindered efficient and thorough work in analytics for auditing, performance metrics, and financial reporting. The following are 10 issues we’ve run into as we’ve set up analytic teams in finance, accounting, IT, and internal audit departments:
There is often an IT person, doing their job well, who acts as gatekeeper and protector of the ERP data. Across our careers in governments and corporations, we have waited as long as a year to get our hands on this data. Accessing the ERP data has been as easy as bringing cupcakes to IT, and as hard as waving the Audit Charter that states auditors get access to the ERP data.
Your ERP system may have thousands of database tables. It will also contain "Views," but it is not best practice to rely on these, as they may contain mistakes. Hopefully, your ERP comes with a solid Data Dictionary or user manual that explains how the tables relate to one another. (Our best advice: start with one process area, like Accounts Payable, and learn everything there is to know about that one function.)
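One way to act on the "start with one process area" advice is to query the database's own catalog for tables whose names look Accounts-Payable-related. The name patterns below are assumptions for illustration; your ERP's Data Dictionary is the real authority. This self-contained sketch uses SQLite's `sqlite_master` catalog; on most ERP back-ends you would query `information_schema.tables` (or the vendor's equivalent) instead.

```python
import sqlite3

# Assumed naming patterns -- every ERP differs; check your Data Dictionary.
AP_PATTERNS = ("AP%", "%INVOICE%", "%VENDOR%")

def find_ap_tables(conn):
    """Return sorted table names that look Accounts-Payable-related."""
    found = set()
    cur = conn.cursor()
    for pattern in AP_PATTERNS:
        # sqlite_master is SQLite's catalog; swap in information_schema.tables
        # for SQL Server/Postgres, or ALL_TABLES for Oracle.
        cur.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name LIKE ?",
            (pattern,),
        )
        found.update(row[0] for row in cur.fetchall())
    return sorted(found)

# Demo with a throwaway schema standing in for an ERP.
conn = sqlite3.connect(":memory:")
for ddl in (
    "CREATE TABLE AP_HEADER (id INTEGER)",
    "CREATE TABLE VENDOR_MASTER (id INTEGER)",
    "CREATE TABLE GL_BALANCES (id INTEGER)",
):
    conn.execute(ddl)

print(find_ap_tables(conn))  # AP_HEADER and VENDOR_MASTER match; GL_BALANCES does not
```

A catalog sweep like this gives you a shortlist of tables to look up in the Data Dictionary, rather than reading it cover to cover.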
Once you have access to the data and start to understand the tables, how do you build a data pipeline that delivers it on a nightly, weekly, monthly, or quarterly basis? Because you could have millions of rows, you will likely need to either work with APIs or automate SQL jobs that dump the data into a folder for you. (It is NOT best practice to keep a live connection to the production database; we have seen analysts bring down an entire organization's ERP system by hooking into it. There are also tools that can help here, like CData or SnapLogic.)
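The "SQL job that dumps the data in a folder" pattern can be sketched as below. The table, query, and folder are stand-ins for illustration only, and the demo uses an in-memory SQLite database in place of the reporting replica a real job would connect to; never point a job like this at the live production database.

```python
import csv
import sqlite3
import tempfile
from datetime import date
from pathlib import Path

def dump_extract(conn, query, out_dir):
    """Run `query` and stream the result to out_dir/extract_YYYY-MM-DD.csv."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / f"extract_{date.today().isoformat()}.csv"
    cur = conn.execute(query)
    with out_path.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur)  # streams rows; avoids holding millions in memory
    return out_path

# Demo against an in-memory stand-in for the replica database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ap_invoice (invoice_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO ap_invoice VALUES (?, ?)", [(1, 250.0), (2, 99.5)])
extract_path = dump_extract(conn, "SELECT * FROM ap_invoice", tempfile.mkdtemp())
```

Scheduled with cron or Windows Task Scheduler, a script like this fills the drop folder on whatever cadence you choose.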
Once you are receiving the data on a regular basis, how do your tools automatically refresh all of your analytics? Pulling new data into Excel by hand is neither maintainable nor scalable. Options include scheduled refreshes in cloud-based tools like Power BI or Tableau, or scripted refreshes in analytic tools like Python or Arbutus.
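For the Python route, an automatic refresh can be as simple as a scheduled script that loads the newest dump from the drop folder and rebuilds its summary figures, so nothing is re-pulled into Excel by hand. The `extract_*.csv` layout and the `invoice_id`/`amount` columns here are assumptions for illustration.

```python
import csv
import tempfile
from pathlib import Path

def latest_extract(folder):
    """Newest extract_*.csv by filename (ISO dates sort lexically)."""
    files = sorted(Path(folder).glob("extract_*.csv"))
    if not files:
        raise FileNotFoundError(f"no extracts found in {folder}")
    return files[-1]

def refresh_summary(folder):
    """Recompute headline AP metrics from the newest extract."""
    with latest_extract(folder).open(newline="") as fh:
        rows = list(csv.DictReader(fh))
    return {
        "invoice_count": len(rows),
        "total_amount": sum(float(r["amount"]) for r in rows),
    }

# Demo: two nightly drops; the refresh picks up only the newer one.
drop = Path(tempfile.mkdtemp())
(drop / "extract_2024-01-01.csv").write_text("invoice_id,amount\n1,100.0\n")
(drop / "extract_2024-01-02.csv").write_text("invoice_id,amount\n1,100.0\n2,50.0\n")
summary = refresh_summary(drop)
```

Run on a schedule right after the nightly extract lands, this becomes the "automatic refresh" step of a scripted workflow.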
Analysts who understand audit or accounting are difficult to find and in high demand. We have experienced being the most technical person on our teams, stretched across too many projects with no time to make any one thing great. In some instances, you may have a technical internal auditor who only gets to spend half of her time on analytic work.
We have seen many organizations spend years building up an analytics program, only to have the entire analytics team leave within a couple of years, putting the organization back at square one.
It is difficult for non-technical people to judge whether an "analyst" is actually good at what they do. An analyst with fewer than five years of experience will likely produce work with errors and omissions, and may be weak at commenting their code. Solution: analysts are like turtle doves; get two of them, or at least pair your solo analyst with an experienced contractor who can help.
In the software world, User Interface and User Experience (UI/UX) designers and Product Managers are extremely important; these roles pinpoint end-user needs. In the analyst world, where those roles do not exist, analysts are often expected to stand in the gap, though they are not usually trained in eliciting end-user needs. (We have been those analysts! Individuals on our team have been guilty of building many dashboards that no one used. We eventually adopted an Agile approach to analytics and wrote User Stories for every single analytic we built, so everything we made was extremely specific about what it did and whom it helped.)
Some organizations are proclaiming that "dashboards are dead." We do not go that far, but as a team that has used Power BI and Tableau since they were released, we eventually hit a wall in their capabilities. The two big barriers we encountered were the inability to get feedback loops from dashboard users and the inability to support a true workflow; after all, a dashboard is meant to be a dynamic report, not something that can receive data back from users. (The real power in analytics comes when your end users tell you that what you uncovered represents cost savings, fraud, or otherwise inappropriate spending. That end-user input can also feed data science models.)
Historically, audit writes a report telling management to monitor something without providing any help on how to do it. (The newer Three Lines Model emphasizes collaboration and communication: audit and management should collaborate in an appropriate way, otherwise work is duplicated. A caveat for auditors: dashboards also limit your ability to know whether management is actually monitoring, or merely has access to a dashboard.)