Association of Local Government Auditors' (ALGA) Quarterly Journal features Dr. Sam Gallaher

David Osborn


Below is the article that ALGA published in its Quarterly Journal.

LET THE DATA SPEAK FOR ITSELF, EARLY AND OFTEN

By Sam Gallaher

Using analytics to make timely, risk-informed decisions for your annual risk assessment.



In late January, Terence Williams, Chief Audit Executive for the City of Wilmington, Delaware, and his team were planning their annual risk assessment.

As in any audit shop, a prime concern is how to most effectively collect and combine different streams of information to measure risks across the organization. His team uses a range of methods to assess organizational risk: surveys with multiple-choice questions to capture general sentiment, open-ended questions to solicit more detail, interviews with key officials, risks identified in previous audits, and records from the fraud hotline.

This approach is similar to other annual risk assessment methods I’ve experienced, including when I worked for the City and County of Denver. Naturally, assessing an organization and building a meaningful audit plan is a heavy lift. And it carries its own risk: a misalignment of planned audits with actual organizational risk. A key difference this year is that Auditor Williams and his team have implemented hundreds of ThirdLine’s audit analytics, refreshed nightly, across more than a decade of transactions in their enterprise resource planning (ERP) system, Tyler Munis. Now they are poised to use those analytics to inform their annual risk assessment.

“How should we use it?” As Head of Data Science for ThirdLine, I ask this question for every analytic we build. Auditor Williams asked it with respect to using his team’s analytics results in his annual risk assessment process. It’s a great question.

ERPs Enable Data-Driven, Automated Annual Risk Assessments

An organization’s ERP system typically holds and governs financial and human resources transactions. Not only does it contain the accounting of these transactions in the journal entries, but it also includes the subledger systems that feed the general ledger, such as purchase orders, invoices, payment processing, and payroll. Additionally, it is within the ERP system that individuals make decisions to initiate and approve processes that expend the tax dollars for which governments are accountable. Because of this, the ERP should be a primary subject for any auditor’s annual risk assessment. Not only does it hold information about the transactions, but it holds the keys (i.e., permissions) to who can make them.
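
To make the permissions point concrete, here is a minimal sketch, in Python with pandas, of a permission-based separation-of-duties check. The table layout, user names, and permission labels are hypothetical stand-ins for this sketch, not any particular ERP’s schema.

```python
import pandas as pd

# Hypothetical export of ERP role assignments: one row per user per permission.
# Column names and permission labels are illustrative, not a real ERP's schema.
roles = pd.DataFrame({
    "user":       ["avery", "avery", "blake", "casey", "casey"],
    "permission": ["create_po", "approve_po", "create_po",
                   "enter_invoice", "approve_invoice"],
})

# Permission pairs that should not be held by the same person.
conflicts = [("create_po", "approve_po"), ("enter_invoice", "approve_invoice")]

# Collect each user's permissions, then flag anyone holding both sides of a pair.
perms_by_user = roles.groupby("user")["permission"].agg(set)
for a, b in conflicts:
    for user in perms_by_user[perms_by_user.apply(lambda p: a in p and b in p)].index:
        print(f"Separation-of-duties conflict: {user} holds {a} and {b}")
```

The same pattern scales to a full conflict matrix exported from an ERP’s security module.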

Most auditors reading this article know this already. I am confident they also know that both performance and financial risks can be found within their organization’s digital systems. What I hear often is, “if only we could get the data…”. And once they have the data, “How should we use it?”



Denver’s Chief Audit Executive, Timothy O’Brien, knew this truth about the value of data, which spurred the inception of their continuous auditing program in 2016.

For example, he understood that there were instances of contracts being created just below the city council’s approval threshold. My task was to get to the data and write the analytics to measure and track the practice. Over time, this grew into an updatable suite of analytics across multiple systems and processes, such as purchase cards, purchase order shipping addresses, manual journal entries, short-term rental licenses, and tax returns and write-offs. From this, Denver’s audit management team could see trends in certain risks and pair them with their other risk assessment information, such as interviews and surveys. Indeed, these analytics surfaced risks that led to many full audit engagements with meaningful recommendations.
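
As an illustration of that first analytic, here is a minimal sketch of a below-threshold contract test. The $500,000 threshold, the 5% band, and the column names are assumptions for the sketch, not Denver’s actual figures.

```python
import pandas as pd

# Illustrative contract data; the column names, the $500,000 approval
# threshold, and the 5% band are assumptions, not Denver's figures.
contracts = pd.DataFrame({
    "vendor": ["Acme", "Acme", "Bravo", "Cobalt"],
    "amount": [495_000.00, 498_500.00, 120_000.00, 499_999.00],
})

THRESHOLD = 500_000
BAND = 0.05  # flag amounts within 5% below the threshold

near = contracts[contracts["amount"].between(
    THRESHOLD * (1 - BAND), THRESHOLD, inclusive="left")]
print(near)

# A repeat pattern is more telling than a single hit: the same vendor
# landing just under the threshold more than once.
repeats = near.groupby("vendor").size()
print(repeats[repeats > 1])
```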

In a similar way, around 2007, auditors at the City of Tulsa, Oklahoma, started diving into their data systems to quantify risks.

At the time, Nathan Pickard (now ThirdLine’s Head of Product) was an auditor under Cathy Carter, now the City Auditor. They began by answering discrete risk questions with data. Since then, Auditor Carter has directed her team and hired consultants to build one of the most data-driven audit shops in local government. Around 2018, Carter dedicated resources to systematically map each financial process in their ERP system and build automated analytics based on risks identified by key stakeholders within each process. The method her team uses, which is mirrored in ThirdLine, allows them to see risks over time by major function (e.g., purchasing, accounts payable, purchase cards, general ledger, accounts receivable, payroll, and separation of duties).

With data like this, updated automatically and tracked over time, Auditor Carter can perform risk assessments on the City’s ERP so efficiently that her team is moving from annual to quarterly risk assessments. In talking with Auditor Carter, I learned her team has continued its innovative use of analytics by implementing a truly agile process, in which auditors rank the analytics results by importance to target the riskiest areas shown in the data. The graphic below shows an example of a quarterly risk assessment in a single department’s accounts payable processes.

Returning to the Pivotal Question of Analytics in Audit: “How Should We Use It?”

On the surface, analytics are simple. Each one should answer a specific question and approximate a risk based on the data available. But how to apply analytics in an audit, and particularly in an annual risk assessment, is not a simple question to answer.

A few conditions need to exist before using analytics for organizational risk assessments, which I hope will help readers prepare for and apply analytics to their process.

  1. The data needs to be trusted (that old garbage-in, garbage-out argument). Auditors are great at this, as they are trained in professional skepticism and know how to assess data quality. Any analytics program should test the underlying data used in its analytics and test the results to see if the outcomes are accurate; a minimal sketch of such checks appears after this list.
  2. The data needs to be relevant to the time period in question and the risk measures need to look at risk over that time. To measure is human, to trend is divine. Enough said.
  3. The analytics need to cover major processes across the organization. If your analytics do not cover your organization, I would question whether they are viable for an annual risk assessment. However, if you are applying analytics to your ERP system, there is a good chance they apply to all departments. In this light, ERPs are enabling better organizational risk assessments. The data is there; we just need to get to it. If you can unlock the data, then you are halfway there.
  4. The analytics need to cover entire processes. Just as the data should span the organization, the risk metrics should cover key parts of each process within the ERP. Auditors can, as highlighted by Tulsa’s example, build this understanding and risk library over time. Alternatively, ThirdLine provides analytics across most major financial and human resource processes.
  5. Whether you build or buy, your analytics will be better suited to individual audits, rather than annual risk assessments until you have the right coverage across each process and the organization. The graphic below shows an example of invoice amounts associated with high-risk versus no-risk transactions.
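
As promised in condition 1, here is a minimal sketch of basic data-quality tests, assuming a hypothetical invoice extract; adapt the checks and column names to your own ERP.

```python
import pandas as pd

# Hypothetical invoice extract; adapt column names and checks to your ERP.
invoices = pd.DataFrame({
    "invoice_id": [101, 102, 102, 104],
    "amount":     [250.0, None, 980.0, -40.0],
    "posted":     pd.to_datetime(["2024-01-05", "2024-01-09", "2024-01-09", None]),
})

# Basic trust tests to run before relying on any analytic built on this data.
checks = {
    "duplicate invoice ids": int(invoices["invoice_id"].duplicated().sum()),
    "missing amounts":       int(invoices["amount"].isna().sum()),
    "negative amounts":      int((invoices["amount"] < 0).sum()),
    "missing post dates":    int(invoices["posted"].isna().sum()),
}
for name, count in checks.items():
    print(f"{name}: {count}")

# Also reconcile extract totals against a trusted source, such as the
# general ledger, before treating subledger results as accurate.
```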

Once these conditions are met, the “how” question is ready to answer. Our experiences on the “how” mirror the three examples in this article, but I feel analytics need to be treated just like any other form of information. They should add to the discussion, not necessarily dominate it:

  1. Review the trends. Is a risk getting higher or lower? A risk score is meaningless without knowing its context. Ask the data if the risk is increasing or decreasing. Use the data to determine if the risk is systemic or occurring in just one department. Note areas where you see the risk increasing. Look for areas where one department is dropping in risk while another is stagnant. The goal is to let the data show abnormal changes when compared over time and against other groups (a minimal sketch of this, together with the baseline idea in the next item, follows this list).
  2. Set internal baselines of risk. Each analytic has a false-positive rate, meaning not all transactions identified by a risk analytic are a problem, and some rates are higher than others. For example, Benford’s law has a high false-positive rate, while an analytic showing that the same person who submitted an invoice also approved it is likely to be more accurate. When these are grouped together by process, your data can tell you what “normal” looks like, and you can decide what is relatively high versus low risk from there.
  3. Compare across organizations. With ThirdLine’s single-tenant framework, we can help you look at how your organization compares to others in specific risk areas. As noted before, every analytic has false positives and variations, so seeing a broader baseline is beneficial to understanding your relative risk. The graphic below shows an example of a comparison of risk between a single organization and the larger group on the same ERP system.
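
Here is the sketch referenced in item 1: a per-department trend table of an analytic’s flag rate, followed by a simple first-digit (Benford) deviation score of the kind discussed in item 2. All departments, counts, and amounts are hypothetical.

```python
import numpy as np
import pandas as pd

# 1. Review the trends: an analytic's flag rate per department per quarter.
flags = pd.DataFrame({
    "dept":    ["Parks", "Parks", "Parks", "Fleet", "Fleet", "Fleet"],
    "quarter": ["2023Q1", "2023Q2", "2023Q3"] * 2,
    "flagged": [12, 18, 27, 30, 29, 31],
    "total":   [400, 410, 395, 900, 905, 910],
})
flags["rate"] = flags["flagged"] / flags["total"]
print(flags.pivot(index="quarter", columns="dept", values="rate"))
# Parks is rising quarter over quarter; Fleet is flat. The trend, not any
# single score, is what earns a spot in the risk assessment.

# 2. Set internal baselines: a simple first-digit (Benford) deviation score.
# Benford tests have a high false-positive rate, so judge each period's
# score against your own history rather than treating any hit as a finding.
amounts = pd.Series([132.5, 18.0, 1450.0, 112.0, 19.9, 175.0, 1320.0, 14.2])
first_digits = amounts.astype(str).str.lstrip("0.").str[0].astype(int)
observed = first_digits.value_counts(normalize=True).reindex(range(1, 10), fill_value=0)
expected = pd.Series({d: np.log10(1 + 1 / d) for d in range(1, 10)})
print(f"Benford deviation score: {(observed - expected).abs().sum():.3f}")
```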

From here, we can use the more qualitative data collection to our advantage. Depending on your risk coverage in analytics, use surveys and interviews to ask about what you learned in the data analysis; the analytics results can inform your surveys or interviews. Compare the results and determine whether the surveys corroborate the data. If they do, the evidence is strong to include that topic in the next round of audits. If they do not, it can lead you to more direct and interesting questions about why a risk is seen in the data but not perceived by the administration.
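
One minimal way to structure that comparison, assuming hypothetical departments and scores, is to rank departments by analytic flag rate and by survey-perceived risk, then look at where the two rankings diverge.

```python
import pandas as pd

# Hypothetical flag rates from analytics and perceived-risk ratings (1-5)
# from an annual survey, joined by department.
analytic = pd.DataFrame({"dept": ["Parks", "Fleet", "HR"],
                         "flag_rate": [0.068, 0.034, 0.005]})
survey   = pd.DataFrame({"dept": ["Parks", "Fleet", "HR"],
                         "perceived_risk": [2.1, 4.2, 2.0]})

merged = analytic.merge(survey, on="dept")
merged["data_rank"]   = merged["flag_rate"].rank(ascending=False)
merged["survey_rank"] = merged["perceived_risk"].rank(ascending=False)
merged["divergence"]  = (merged["data_rank"] - merged["survey_rank"]).abs()

# Where the rankings diverge is where the interesting interview questions
# live: a risk seen in the data but not perceived, or vice versa.
print(merged.sort_values("divergence", ascending=False))
```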

Alternatively, if you are highly confident in your coverage of your ERP, you can use your surveys and interviews to learn about risks elsewhere. In this way, the analytics are broadening your ability to review more risk with fewer resources.

I do not know if we can or should ever move away from using qualitative tools like surveys and interviews to learn about risk as they provide rich information.

However, with today’s technology and the digitization of government processes through ERP systems, auditors can and should include analytics to inform their assessments. Analytics add to an organization’s assessment, shed light on a typical risk blind spot, and do so extremely efficiently. They pave the way to looking at risk in real time. In my opinion, let the data talk, and have those conversations early and often.
