The accounting press has heralded the growing, transformational use of data analytics in accounting, particularly in auditing. A recent Gartner poll of audit leaders found that the “top focus areas for audit leaders in 2022 are attracting talent with nontraditional audit skills and effectively leveraging more advanced data analytics applications” (Gartner, “Top Priorities for Audit Leaders in 2023,” https://gtnr.it/3Q5pkp3).
“The demand for accountants with data analytics skills is growing rapidly, providing for exceptional career opportunities,” Vasarhelyi et al. noted in a 2017 CPA Journal article. The authors reported that “because the insight extracted from data can provide for a meaningful competitive advantage, organizations are allocating increasing amounts of resources to analytics initiatives.” (Miklos A. Vasarhelyi, Norbert Tschakert, Julia Kolina, and Stephen Kozlowski, “How Business Schools Can Integrate Data Analytics into the Accounting Curriculum,” The CPA Journal, October 2017, https://bit.ly/473F55L) There has been little said, however, about what firms are actually doing in this vein.
The authors sought to determine to what extent auditors have adopted data analytics and, more specifically, which tools and techniques they are using; in other words, does the buzz around data analytics in auditing match the reality? The study consisted of two phases: The authors first interviewed several professionals working as auditors, mainly in public accounting. From those interviews, they prepared and then distributed a survey instrument to explore the extent to which auditors have been using data analytics in their practice.
One challenge was settling on the meaning of the term data analytics. A common definition is: “Analytics is the process of (and tools for) analyzing data with the objective of drawing meaningful conclusions” (Roshan Ramlukan, “How Big Data and Analytics are Transforming the Audit,” Financial Executive Annual, 2015, https://bit.ly/44H6UQ2). Although useful, there is nothing in this definition to differentiate data analytics from basic data analysis or traditional statistics. The authors attempted to distinguish between basic data analysis procedures that have been used in auditing for decades, and newer, more sophisticated techniques. Approaches that fall in the latter category include affinity analysis, classification models, and prediction models. Classification models, as the name suggests, assign entities to categories. These models can be used, for example, to place vendors into one of two possible categories, “good” or “suspicious,” based on a set of their characteristics. Prediction models, including linear regression, are distinguished from classification models in that the output is not a category, but a number. Affinity analysis is a mechanism for finding patterns in data. For example, an analysis of a company’s cash disbursements will reveal vendors considered “usual” recipients of checks or wires for amounts “typically” disbursed. Although some might argue that basic visual displays (e.g., column charts) are not truly data analytics, the survey did not distinguish between simple and sophisticated visual displays of data. However, it did ask which tools auditors used for data visualization; the use of more sophisticated tools like Tableau or Power BI may indicate more advanced displays.
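To make the classification idea concrete, the following sketch implements a k-nearest neighbors classifier, one of the machine learning classification methods discussed later in this article, in plain Python. The vendor characteristics (invoice count and average invoice amount) and the labeled training data are hypothetical, chosen only to illustrate how such a model assigns a vendor to the “good” or “suspicious” category.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def knn_classify(training, features, k=3):
    """Classify a vendor by majority vote of its k nearest labeled neighbors."""
    nearest = sorted(training, key=lambda row: dist(row[0], features))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Hypothetical training data: (invoice count, avg amount in $000s) -> label
training = [
    ((120, 5.0), "good"), ((95, 4.2), "good"), ((110, 6.1), "good"),
    ((3, 48.0), "suspicious"), ((2, 52.5), "suspicious"), ((5, 45.0), "suspicious"),
]

print(knn_classify(training, (4, 50.0)))   # few, large invoices -> "suspicious"
print(knn_classify(training, (100, 5.5)))  # many, routine invoices -> "good"
```

In practice an auditor would use many more characteristics and a vetted training set, but the mechanism is the same: the model places a new entity in whichever category its most similar known entities belong to.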
More generally, for the purposes of this survey, the authors distinguished between tools and procedures. Procedures (or techniques or methods) refer to the actual analysis, such as taking a sample, calculating a summary value, or creating a chart. A tool is the software (e.g., Excel) used for implementing a procedure.
Auditor Interviews and Survey Instrument
The authors began by interviewing a wide range of audit professionals, from first-year audit staff to audit partners at CPA firms of all sizes, as well as a former head of innovation for an international audit firm. These conversations helped the authors determine which survey questions to ask; they also made clear that a sharper definition of data analytics, and a distinction between tools and procedures, was needed. Furthermore, interviewees repeatedly said that data analytics, such as improved data visualization, significantly helps auditors during the initial risk assessment and planning stage of the audit. In tests of controls and substantive testing, data analytics is now often used to test whole populations rather than randomly selected samples, automating audit test procedures previously conducted manually. To test whether this was the case, the survey solicited data on tool usage for the different tasks conducted as part of the audit.
The survey began with background questions: Respondents were asked about their job titles, firm size, and industry specialization within their audit practices. This information is summarized in Exhibit 1.
Information on Survey Respondents (Total = 81)
The survey instrument broke the audit process into six distinct tasks:
- Risk assessment and audit planning
- Calculate sample size
- Data extraction
- Clean (scrub) the data
- Auditing data during substantive/detail testing
- Data visualizations
For each of the six tasks, respondents were asked how often they used analytics tools and techniques. Possible responses were “always,” “often,” “sometimes,” “rarely,” and “never.” In addition, respondents were asked which specific tools they used, choosing from a set of typical options. Respondents could select multiple tools.
The survey was posted via LinkedIn to a general audience and also within LinkedIn’s AICPA group. The communication specifically addressed the authors’ purpose in determining how public accounting firms use data analytics in their audits. Contacts were encouraged to share the survey with colleagues working for public accounting firms. Over a period of approximately four months, 81 usable responses were received.
The results were grouped into Big Four and non–Big Four respondents; the latter group comprised respondents who classified their firm as “middle-tier, regional or small.” Exhibit 2 shows the percentage of respondents by firm size (Big Four vs. non–Big Four) and the analytics techniques they used. Except for data visualization (the last task), usage is high by firms in both categories. Generally, Big Four firms use analytics more than non–Big Four firms; this difference is most pronounced in cleaning data, substantive/detail testing, and data visualization. The authors tested the statistical significance of the difference in proportions between Big Four and non–Big Four firms; the results are presented below each chart. Here, for example, the results for “data extraction” are statistically significant at the 0.05 level; “clean (scrub) the data” is significant at the 0.01 level.
Survey Results for Audit Tasks
The following sections present the results of questions asking respondents to identify the specific tools they or members of their audit teams used for each of the six tasks. Exhibits 3–8 display the percentages of respondents who answered that a tool was used either “always” or “often.” Note that respondents could select more than one tool, so totals can exceed 100%.
Tools Used for Risk Assessment and Audit Planning (% of respondents answering “always” or “often”)
Risk Assessment and Audit Planning (Exhibit 3).
Of all the tasks identified, risk assessment and audit planning had the highest tool usage, particularly among Big Four respondents. Consistent with results throughout this survey, Excel is a favorite of firms of all sizes. The authors’ interpretation of Exhibit 3 is that firms use Excel alongside their proprietary software packages or commercially available software. It is understandable that Big Four firms are more likely than smaller firms to use proprietary software, as they have the economies of scale to justify it: Not only do the Big Four have greater resources to invest in its development, but their large client base also gives them more opportunities to use it. Almost three-quarters of Big Four respondents indicated they use proprietary software. It is less clear why 54% of non–Big Four auditors reported using proprietary software for risk assessment and audit planning, as smaller firms are unlikely to have the resources to develop it; one might speculate that these respondents are referring to customized Excel templates or macro-enabled worksheets they have developed to meet their needs. About half of all firms used commercially available software. Finally, firms appear to use multiple tools for this task rather than standardizing on one.
Calculating Sample Size (Exhibit 4).
What is most notable for calculating sample size is that almost all Big Four respondents use proprietary software for this task, more than double the percentage of non–Big Four respondents. As shown in Exhibit 4, 41% of Big Four respondents and half of non–Big Four respondents use Excel for this task. This is consistent with the risk assessment results, where larger firms can afford proprietary software. Commercially available software is used to a much lesser extent by Big Four firms.
A data analytics approach typically uses all of the available data, not a subset. Yet even with the advance of data analytics and the tools available to test full populations, only 12% of Big Four respondents reported using all transactions in place of sampling, and only 25% of non–Big Four respondents did. The authors surmise that these low rates are due to inertia in following traditional practice and PCAOB guidance, which states: “The justification for accepting some uncertainty arises from the relationship between such factors as the cost and time required to examine all of the data and the adverse consequences of possible erroneous decisions based on the conclusions resulting from examining only a sample of the data. If these factors do not justify the acceptance of some uncertainty, the only alternative is to examine all of the data. Since this is seldom the case, the basic concept of sampling is well established in auditing practice.” (PCAOB, “Uncertainty and Audit Sampling,” AU 350.07, https://bit.ly/3DvJ3qk)
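Testing a full population rather than a sample is often a few lines of code. The sketch below runs two simple exception tests over every record in a hypothetical cash-disbursement file; the vendor names, check numbers, and the $50,000 approval limit are all assumptions for illustration.

```python
from collections import Counter

# Hypothetical cash-disbursement population: (check number, vendor, amount)
disbursements = [
    (1001, "Acme Supply", 4800.00),
    (1002, "Beta Services", 12500.00),
    (1003, "Acme Supply", 4800.00),     # possible duplicate of check 1001
    (1004, "Gamma Freight", 51000.00),  # exceeds the assumed approval limit
]

APPROVAL_LIMIT = 50000.00  # assumed client policy threshold

# Test 1: every item above the approval limit, across the full population
over_limit = [d for d in disbursements if d[2] > APPROVAL_LIMIT]

# Test 2: duplicate vendor/amount pairs anywhere in the population
pair_counts = Counter((vendor, amount) for _, vendor, amount in disbursements)
duplicates = [d for d in disbursements if pair_counts[(d[1], d[2])] > 1]

print(over_limit)   # the Gamma Freight check
print(duplicates)   # both Acme Supply checks
```

Because every transaction is examined, sampling risk for these two tests disappears; the remaining audit effort shifts to investigating the flagged exceptions.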
Extracting Data (Exhibit 5).
For the extraction of data, Excel is the popular choice for respondents from all firms. As seen in Exhibit 5, this is followed by proprietary software for Big Four firms and data extraction by client for non–Big Four firms. It is perhaps surprising to see that Power BI and Tableau, which have sophisticated data extraction capabilities, were not more widely used. Perhaps auditors think of Power BI and Tableau as tools for data visualization, not data extraction.
Several comprehensive audit packages (e.g., CaseWare IDEA) and general analytics packages (e.g., Power BI, Alteryx) provide easy-to-use tools that not only find and filter the needed data elements, but also combine them with data elements from other associated tables. Instead of requiring auditors to write Structured Query Language (SQL) queries, these packages provide a graphical interface for creating the query. They also offer structured ways of resolving missing data, replacing the ad hoc formulas typically created in Excel.
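The query that such a graphical interface generates behind the scenes is ordinary SQL. As a sketch, the following uses Python’s built-in sqlite3 module with hypothetical vendor and payment tables to show the kind of join-and-filter extraction these tools automate:

```python
import sqlite3

# In-memory database with hypothetical vendor and payment tables
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE vendors (vendor_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE payments (payment_id INTEGER, vendor_id INTEGER, amount REAL);
    INSERT INTO vendors VALUES (1, 'Acme Supply'), (2, 'Beta Services');
    INSERT INTO payments VALUES (10, 1, 4800.0), (11, 1, 250.0), (12, 2, 12500.0);
""")

# Extract and combine: total payments per vendor above a filter threshold
rows = con.execute("""
    SELECT v.name, SUM(p.amount) AS total
    FROM payments p JOIN vendors v ON v.vendor_id = p.vendor_id
    GROUP BY v.name
    HAVING total > 1000
    ORDER BY total DESC
""").fetchall()

print(rows)  # [('Beta Services', 12500.0), ('Acme Supply', 5050.0)]
```

A point-and-click query builder saves the auditor from writing the JOIN, GROUP BY, and HAVING clauses by hand, but the underlying operation is the same.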
In addition to ease of use, using a standard tool to extract/clean data creates a repository for accomplishing this task that can be easily reused by successive audit teams in subsequent years. If a widely used package is adopted, an audit firm might benefit from reduced training time of new hires, as they might have experience with the package, or at least a package with similar capabilities.
Using these packages to extract data reduces the barriers to adding data visualization to the audit. Packages like Power BI or Tableau, for example, provide simple interfaces to turn data into charts or dashboards. Some have “wizards” that suggest the best visualization for a given set of data. Some packages also include tools, such as regression or clustering, that analyze the data for trends or exceptions. These tools, however, are not configured out of the box for an audit; users must configure them for the problem being addressed. Survey respondents indicate that this rarely happens. This is not surprising, as the necessary knowledge is not taught in a typical accounting curriculum, nor is an accounting professional typically required to learn it.
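Configuring such a tool for an audit problem is less daunting than it may sound. As an illustration, the sketch below fits an ordinary least-squares trend line to a hypothetical series of monthly revenue figures and flags any month that deviates from the trend by more than an assumed tolerance, which is essentially what a regression-based exception test in one of these packages does.

```python
def fit_trend(ys):
    """Ordinary least-squares line through (0, y0), (1, y1), ..."""
    n = len(ys)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return slope, intercept

# Hypothetical monthly revenue ($000s); month 4 is unusually high
revenue = [100, 104, 108, 112, 160, 120]

slope, intercept = fit_trend(revenue)
# Flag months whose residual exceeds an assumed tolerance of $20,000
exceptions = [m for m, y in enumerate(revenue)
              if abs(y - (slope * m + intercept)) > 20]
print(exceptions)  # [4]
```

The audit-specific configuration lies in choosing the account, the time granularity, and the tolerance, not in the statistics, which the package supplies.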
Scrubbing Data (Exhibit 6).
Comments from interviews and survey results indicated that data cleaning is a very time-consuming process. For cleaning data, Microsoft Excel is again the tool of choice, used by 74% of Big Four and 67% of non–Big Four respondents. Interestingly, non–Big Four firms are more likely than Big Four firms to use Power BI, Tableau, or Access; overall, smaller firms rely on tools other than Excel for scrubbing data more than Big Four firms do.
Substantive/Detail Testing (Exhibit 7).
To conduct substantive/detail testing, auditors again seemed to favor Microsoft Excel. Exhibit 7 shows that 83% of Big Four and 77% of non–Big Four firms reported that they or members of their audit teams used Excel always or often. Excel may be particularly well suited to substantive testing because of its flexibility to meet the needs of whatever tests auditors develop.
Data Visualization (Exhibit 8).
As shown in Exhibit 8, the preference for Microsoft Excel is equal for both groups of firms at 63%. Big Four and non–Big Four auditors are also fairly equal in their usage of Power BI, Tableau, and proprietary software, while more non–Big Four auditors use commercially available software. Power BI and Tableau were designed for data visualization and offer more powerful visualization features than Excel; even so, many respondents still seem to prefer Excel for data visualization.
Other software tools used in analytics.
To better understand which analytics tools auditors are utilizing, the survey asked if Python or R (both widely used open-source programming languages for data analytics) or SAS (a commercial package) were used during an audit engagement. Only a few respondents indicated that they used these tools for any analytics tasks. For the relatively few respondents who indicated that they used Tableau, the survey asked which tasks Tableau was used for; the most common responses were data visualization, project management, and risk assessment. For Power BI, the most common responses were data analysis, data visualization, and risk assessment. In addition, the survey asked if certain advanced analytics techniques (linear regression, logistic regression, K–nearest neighbors algorithm, time series forecasting, classification trees, affinity analysis) were employed in the engagement. A few respondents said that linear regression and forecasting were used, but none of the other techniques appeared to be used.
Ripe for Growth
Some data analytics tools have found widespread use throughout the audit process. Notable is auditors’ ongoing preference for Microsoft Excel over software packages designed for specific tasks, as well as the general absence of more advanced data techniques. Although tremendous advances have been made, some areas of the audit process, such as analytical procedures, remain ripe for the application of more advanced data analysis.
The results show that Big Four firms are further along in the implementation of specific software tools and advanced analytical tools than non–Big Four firms. This might be because Big Four audit clients are more complex and more likely to be publicly traded. It might also be because Big Four firms have greater resources and economies of scale to develop and implement new techniques. A possible limitation of this survey is that technologically astute auditors are more likely to respond to surveys on LinkedIn and therefore bias the results.
Discussions about data analytics in auditing seldom distinguish “true” data analytics from the more traditional data analysis that has been featured in auditing for years. This data analysis includes, for example, sampling, computing summary statistics, and creating visual representations of data via basic charts. Data analytics typically includes more sophisticated analyses like regression and machine learning classification methods (e.g., logistic regression, K–nearest neighbors algorithm), as well as more sophisticated visual representations. In addition, a data analytics approach is more likely to test the entire population of data, as opposed to a sample. Under this definition of data analytics, these results indicate that very few firms have truly embraced data analytics, despite the overall buzz to the contrary.
Standards setters and regulators ought to further consider how to encourage auditors to use more sophisticated data analytics techniques, such as auditing entire populations rather than samples, when appropriate. In addition, the Association to Advance Collegiate Schools of Business (AACSB) International accreditation agency has encouraged accounting-accredited education programs to help students develop technological agility—that is, thinking about how to leverage technology in order to solve problems. Educators should thus continue to integrate data analytics and technological acumen into their curricula, so that future auditors are aware of the wide array of data analytics tools available to help them take advantage of new technologies.