Much has changed since the PCAOB began its inspection program in 2004. In just the last 10 years, the number of auditing firms registered with the PCAOB has increased by approximately 37%, 14 auditing standards have been issued, and the inspection process has matured. One percent of auditing firms (the Big Four) are responsible for the audits of U.S. issuers that account for more than 98% of global market capitalization, while 372 auditors are responsible for auditing the remaining 2% (Caleb Newquist, “The Big 4’s Stranglehold on the Audit Market Is Worse than You Thought,” July 17, 2013). Given the need to balance the PCAOB’s increasing responsibilities with its stated objective of improving audit quality through its inspection program, more resources should be devoted to inspections of this 1%. By analyzing recent PCAOB inspection trends and examining auditing firms’ inspection reports, this article provides descriptive evidence that aligns with this expectation.

The PCAOB Inspection Process

Auditing firms that perform more than 100 issuer-audits each year are inspected annually by the PCAOB, and those that perform 100 or fewer issuer-audits are inspected at least triennially. As of Sept. 1, 2016, there were 10 annually inspected firms: Deloitte LLP, Ernst & Young LLP, KPMG LLP, and PricewaterhouseCoopers LLP (collectively, the Big Four); and BDO USA LLP, Crowe Horwath LLP, Grant Thornton LLP, MaloneBailey LLP, Marcum LLP, and RSM US LLP (collectively, the “Next Six”).

Inspection reports have two parts. The first describes the number of audits inspected and identifies inconsistencies with current auditing standards, but does not identify the specific audits inspected. Although the PCAOB has discussed moving to a random selection process, inspections are generally selected on the basis of perceived risk (Jeanette Franzel, “Progress and Evolution in Audit Oversight to Protect Investors and the Public Interest,” keynote speech at 15th Annual Baruch College Financial Reporting Conference, May 5, 2016). Consequently, audits selected for inspection are more likely to be those of issuers with larger market capitalizations, in riskier or more complex industries, or that have experienced prior accounting or auditing issues. The results of this part are available to the public after inspectors complete their work and issue their report.

The second part communicates issues about the inspected auditing firm’s systems of quality control (e.g., policies, processes, and controls over personnel hiring, training, and retention). These findings, if any, are made public only if the auditing firm fails to remediate identified deficiencies to the PCAOB’s satisfaction within one year. As of October 2016, only 244 quality control reports had been released out of the more than 2,500 inspections performed since the program’s inception.

There is evidence that inspection results influence the actions of issuers and investors. For example, there is a positive association between PCAOB inspection findings of triennially inspected firms and auditing firm dismissals (Brian Daugherty, Denise Dickins, and Wayne Tervo, “Negative PCAOB Inspections of Triennially Inspected Auditors and Involuntary and Voluntary Client Losses,” International Journal of Auditing, November 2011). This suggests that issuers may perceive inspection findings as a signal of low audit quality; it may also suggest that issuers do not fully comprehend the nature of most inspection findings. More than 90% of findings relate to the documentation and extent of auditing procedures, while very few are associated with material misstatements of the financial statements (Colleen M. Boland, Veena Brown, and Denise Dickins, “The Impact of PCAOB Inspections on Audit Standard Setting,” working paper).

Comparisons of Auditing Firms

Annually inspected and triennially inspected firms are different. The average/median number of audits of annually inspected firms was 641/296 in 2015, and the average/median audit fee was $1.6 million/$517,000. Comparable data for 362 triennially inspected firms (excluding those that only audit broker-dealers) were 7/2 and $103,000/$50,000, respectively. Issuers of triennially inspected firms are generally smaller and, measured by market capitalization, less risky. As a percentage of their practice, triennially inspected firms are also more likely to audit benefit plan issuers and small financial services issuers.

There are also differences between the Big Four and the Next Six. The average/median number of audits of the Big Four was 1,297/1,366, and the average/median audit fee was $1.9 million/$673,000, compared to 175/134 and $489,000/$292,000, respectively, for the Next Six. Collectively, these data reveal a lack of comparability among the auditing practices of Big Four, Next Six, and triennially inspected firms.

Comparisons of Inspection Data

To determine whether these differences impact the PCAOB’s inspection processes and outcomes, the authors examined inspection reports of the Big Four, Next Six, and a sample of 65 triennially inspected firms that responded to a survey distributed to all U.S.–based triennially inspected firms with at least one non–broker-dealer client. The sample firms are representative of all triennially inspected firms in that they are geographically dispersed, perform an average of seven audits, and have average audit fees of $147,000.

The authors compared four measures of PCAOB inspection outcomes from 2010–2014 for the Big Four and Next Six, and over the last two inspection periods for the triennial sample (covering 2009–2015): 1) the percentage of audits inspected, 2) the percentage of audits inspected with findings, 3) the percentage of audits inspected with findings resulting in a restatement, and 4) the number of quality control reports released. The results are presented in Exhibit 1.


Comparison of PCAOB Inspection Outcomes

Measure (mean); Big Four firms; Next Six firms; Triennially inspected sample firms
Number of audits reviewed; 1,091; 496; 266
Percentage of audits inspected; 4%; 11%; 20%
Percentage of audits inspected with findings; 36%; 44%; 35%
Percentage of audits inspected with “severe” findings; 2%; 1%; 1%
Number of quality control releases during time period; 8; 3; 2

Time period covered is 2010–2014 for the Big Four and Next Six, and 2009–2015 for triennially inspected firms.

As depicted, inspections cover 4%, 11%, and 20%, respectively, of the Big Four, Next Six, and triennially inspected firms’ total audits. Given the risk-based selection process and the riskier profile of their audits, one might expect larger auditing firms to have more findings per audit inspected, but this is not the case. The percentage of audits inspected with findings is approximately the same for the Big Four and triennially inspected firms (36% and 35%, respectively). There is also no difference between annually and triennially inspected firms in terms of the severity of inspection findings (2% and 1%, respectively); however, the percentage of audits inspected with findings for the Next Six (44%) is marginally higher than that of the Big Four (36%).
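As a minimal sketch of how the outcome measures compared above are derived, the following uses made-up per-firm counts (not the authors’ data); the record fields and example numbers are illustrative assumptions only.

```python
# Illustrative computation of the inspection-outcome measures discussed
# above, using hypothetical counts for a single large firm.
from dataclasses import dataclass

@dataclass
class FirmInspections:
    total_audits: int              # issuer audits performed in the year
    audits_inspected: int          # audits the PCAOB selected for inspection
    inspected_with_findings: int   # inspected audits with reported findings
    qc_reports_released: int       # quality control reports made public

def outcome_measures(f: FirmInspections) -> dict[str, float]:
    """Return the percentage-based measures as fractions of audits."""
    return {
        "pct_inspected": f.audits_inspected / f.total_audits,
        "pct_with_findings": f.inspected_with_findings / f.audits_inspected,
        "qc_reports": f.qc_reports_released,
    }

# Hypothetical firm roughly resembling the Big Four averages in Exhibit 1.
big_four_like = FirmInspections(
    total_audits=1_091, audits_inspected=44,
    inspected_with_findings=16, qc_reports_released=8,
)
m = outcome_measures(big_four_like)
```

With these made-up counts, `pct_inspected` comes to about 4% and `pct_with_findings` to about 36%, mirroring the magnitudes reported in Exhibit 1.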

These inspection outcomes may be more representative of the overall practices of triennially inspected firms than they are of annually inspected firms, since tests of a larger proportion of a population may be more representative. The median number of issuers served by triennially inspected firms is two, so in many cases, the PCAOB inspects all or a majority of a firm’s audits.

The lack of variation in inspection outcomes between annually and triennially inspected firms may be attributed to several factors. Auditors may devote additional resources to high-risk audits, reducing the likelihood that insufficient auditing procedures are performed. Alternatively, audit quality may not differ between firms, or systematic differences in the inspection process may make it less likely that audit quality differences among firms will be detected.

In contrast, the average number of quality control releases for each group is meaningfully different (eight, three, and two, respectively). Of the 20 Big Four firm inspections conducted from 2007 to 2011, eight quality control reports (40%) were released between 2010 and 2014, compared with 200 out of 2,000 (10%) for triennially inspected firms. These results are not unexpected, as the complexity of managing an auditing practice likely increases with the size of the workforce (e.g., hiring, training, evaluating). Quality control releases may also be a way for the PCAOB to signal to auditors and stakeholders the importance of strong quality control practices and procedures.

Closer examination reveals that although the number of findings among the Big Four varies, trends in findings do not vary significantly year-to-year for either the Big Four or the Next Six, nor do trends differ significantly between them (see Exhibit 2). Consistent with the overall data presented in Exhibit 1, a larger percentage of Next Six audits inspected have findings (between 40% and 51%) than do Big Four audits inspected (between 32% and 40%).


Summary of Inspection Findings of the Big Four and Next Six

Number of issuers inspected with findings (percentage of issuers inspected with findings)

Firm; 2010; 2011; 2012; 2013; 2014
Deloitte; 26 (46%); 22 (42%); 13 (25%); 15 (28%); 11 (21%)
E&Y; 13 (21%); 20 (36%); 25 (49%); 28 (50%); 20 (36%)
KPMG; 12 (23%); 12 (23%); 17 (35%); 23 (48%); 28 (55%)
PwC; 26 (39%); 26 (43%); 21 (40%); 19 (33%); 17 (30%)
Big Four average; 19 (32%); 20 (36%); 19 (37%); 21 (40%); 19 (35%)
“Next Six” average; 8 (41%); 8 (40%); 10 (49%); 9 (44%); 9 (51%)

Beyond inspection outcomes, the authors expected the PCAOB to devote more time and resources to inspections of the Big Four; the data confirm this expectation. For inspection year 2014, inspectors spent, on average, 426 days inspecting each of the Big Four (seven days per audit), compared to 87 days for each of the Next Six (five days per audit) and four days for each triennially inspected firm (two days per audit). According to conversations with two Big Four audit partners, the inspection process generally starts with a meeting between representatives of the firm and the PCAOB during the fourth quarter preceding the year to be inspected. In contrast, anecdotal comments and a triennially inspected firm’s response to a PCAOB inspection report suggest that the PCAOB occasionally performs “desk reviews” (i.e., remote inspections of electronic files) of triennially inspected firms (Brian Daugherty and Wayne Tervo, “PCAOB Inspections of Smaller CPA Firms: The Perspective of Inspected Firms,” Accounting Horizons, June 2010).

Also of interest is the lag between the last day of inspection and the release date of the inspection report. In 2014, the median lags for the Big Four, Next Six, and the sample of triennially inspected firms were 206 days, 355 days, and 24 days, respectively. These differences may reflect variation in the complexity of the audits; alternatively, they may reflect the negotiating power of annually inspected firms or differences in the inspection process and reporting—the latter of which is investigated below.

Example Inspection Findings and Analyses

The PCAOB periodically publishes reports about its inspection process that describe findings common among many auditing firms, as well as areas intended for future inspections (e.g., PCAOB Release 2015-007; PCAOB Staff Inspection Bulletins 2016/1, 2016/3). These areas include the auditing of accounting estimates (e.g., business combinations, impairment of intangibles, financial instruments, revenue-related estimates and reserves, the allowance for loan losses, inventory reserves, and tax-related estimates) and the appropriate application of—

  • Auditing Standard (AS) 5, An Audit of Internal Control over Financial Reporting That Is Integrated with an Audit of Financial Statements;
  • AS 13, The Auditor’s Responses to the Risks of Material Misstatement;
  • AS 14, Evaluating Audit Results; and
  • AS 15, Audit Evidence.

The authors searched the inspection reports of annually inspected firms covering audit years 2010–2014, and those of the sample triennially inspected firms covering audit years 2009–2015, for findings related to these auditing standards. For each standard, example findings were identified from annually inspected firms (three Big Four and three Next Six) and from six different triennially inspected firms, and these were examined for qualitative differences. A summary of the search is presented below; a full presentation of the findings is included with this article online.

The authors’ analysis suggests that findings of annually inspected firms are more descriptive and have greater specificity than those of triennially inspected firms. For instance, a triennially inspected firm’s finding associated with noncompliance with AS 13 merely states that the firm failed “to perform sufficient procedures to test the valuation of available-for-sale investment securities.” A comparable finding from an annually inspected firm states that the firm:

Tested the valuation of the majority of the issuer’s investments in securities at an interim date; however, the Firm failed to perform sufficient procedures to provide a reasonable basis for extending its conclusions on the valuation of those securities to the balance sheet date. Specifically, with respect to equity securities, the analytical procedures that the Firm performed to roll forward its conclusions to year-end consisted only of determining that the issuer’s returns, by industry sector, were directionally consistent with publicly reported returns for the industry sector, and the publicly reported returns used in the comparison were for a period that was longer than the roll-forward period. For debt securities, the Firm compared the issuer’s yield for its debt portfolio to corresponding publicly available yield rates, but the publicly available information used in the comparison were for a period that was longer than the roll-forward period. In addition, in this analysis, the Firm failed to incorporate the effects of the issuer’s purchases and sales of debt securities during the roll-forward period.

Linguistic analysis also confirms differences in the findings of annually and triennially inspected firms. Annually inspected firms’ findings are more complex (word count of 623, compared to 139 for triennially inspected firms) (Joshua J. Filzen and Kyle Peterson, “Financial Statement Complexity and Meeting Analysts’ Expectations,” Contemporary Accounting Research, Winter 2015). Measuring “tone” using Philip J. Stone’s General Inquirer word list and software, the example annually inspected firms’ findings are also significantly more negative (3% of total words are negative) than the findings of the triennially inspected firms (1%). It is possible that PCAOB inspectors evaluate annually inspected firms more stringently due to the riskier nature of their audits.
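The word-count and tone measures can be sketched in a few lines. The tiny negative-word set below is an illustrative stand-in for the General Inquirer’s much larger negative category, which is what the authors actually used; the function name and word list are assumptions for demonstration only.

```python
# Minimal sketch of the complexity (word count) and tone (share of
# negative words) measures applied to an inspection finding.
import re

# Illustrative stand-in for the General Inquirer negative-word category.
NEG_WORDS = {"failed", "insufficient", "deficiency", "noncompliance", "errors"}

def complexity_and_tone(finding_text: str) -> tuple[int, float]:
    """Return (word count, fraction of words classified as negative)."""
    words = re.findall(r"[a-z']+", finding_text.lower())
    negatives = sum(1 for w in words if w in NEG_WORDS)
    return len(words), (negatives / len(words)) if words else 0.0

# The triennially inspected firm's AS 13 finding quoted above.
count, neg_share = complexity_and_tone(
    "The Firm failed to perform sufficient procedures to test the "
    "valuation of available-for-sale investment securities."
)
```

On the short finding quoted above, one word (“failed”) of seventeen tokens is classified negative; longer, more detailed findings accumulate both higher word counts and, typically, more negative terms, which is the pattern the authors report.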

Of note, the auditing firm Marcum LLP moved from being a triennially inspected firm to an annually inspected firm in 2015 by increasing its issuer audits from 99 to 102. A comparison of Marcum LLP’s last inspection report as a triennially inspected firm (March 28, 2013) to its first as an annually inspected firm (June 16, 2016) reveals no meaningful differences. For audit year 2012, 11 audits were inspected, which is consistent with the inspection rate of the Next Six; of those inspected, none had findings. For audit year 2015, eight audits were inspected, and none had findings. There were also no apparent differences in terms of the reports’ complexity or tone. Since an increase of three audits likely did not impact the structure of Marcum LLP, this is not surprising.

Final Analysis

Compared to triennially inspected firms, the audit practices of the Big Four are meaningfully more risky and complex. The trends and analyses presented here provide evidence that the PCAOB’s inspection program reflects differences in the profiles of its registered auditing firms.

In terms of two inspection outcomes (the percentage of audits with findings, and the percentage of audits with “severe” findings), there is no measurable difference between annually and triennially inspected firms. This is somewhat unexpected in light of the PCAOB’s focus on higher-risk audits; possible explanations include the following: annually inspected firms devote additional resources to high-risk audits, audit quality does not differ between annually and triennially inspected firms, or systematic differences in the inspection process make it less likely that audit quality differences among annually and triennially inspected firms will be detected.

There are, however, meaningful differences when comparing quality control releases of the Big Four and triennially inspected firms: the Big Four have had a larger percentage of their quality control reports released. The time devoted to Big Four and Next Six inspections and reports is also meaningfully longer than that of triennially inspected firms. These results may reflect the additional operating complexity of larger auditing firms or the PCAOB’s desire to signal the importance of strong systems of quality control practices and procedures. Finally, the inspection reports of annually inspected firms are more complex and negative in tone than those of triennially inspected firms.


Before the Sarbanes-Oxley Act of 2002 (SOX), the market typically reacted negatively if an issuer replaced its large auditing firm with a small firm (John Dunn, David Hillier, and Andrew P. Marshall, “The Market Reaction to Auditor Resignations,” Accounting and Business Research, March 1999; Hsihui Chang, C. S. Agnes Cheng, and Kenneth J. Reichelt, “Market Reaction to Auditor Switching from Big Four to Third-tier Small Accounting Firms,” Auditing, November 2010). Many attributed this effect to perceived differences in audit quality among large and small auditing firms. SOX modified the relationship between issuers and their auditors (e.g., the auditor must report to an independent audit committee, and most nonaudit services are prohibited) and created the PCAOB. Since SOX, the results of at least two studies suggest that the stock price penalty for down-tier auditor changes may be a thing of the past (Denise Dickins, “Should Congress Mandate Audit Firm Rotation?” Regulation, Winter 2006; Chang et al. 2010). Consistent with the descriptive analyses presented here, the market may perceive that the PCAOB’s inspection process and other SOX-mandated systematic changes are working to maintain a more consistent level of audit quality among auditors and issuers.

Colleen M. Boland, PhD, CPA is an assistant professor at the University of Wisconsin–Milwaukee, Milwaukee, Wis.
Brian Daugherty, PhD, CPA is an assistant professor at the University of Wisconsin–Milwaukee, Milwaukee, Wis.
Denise Dickins, PhD, CPA, CIA is an associate professor at East Carolina University, Greenville, N.C.
Anna J. Johnson-Snyder, PhD, CPA, CFE is an assistant professor at Bradley University, Peoria, Ill.