How to Detect a Fraudulent SOC 2 Report

In March 2026, Delve, a compliance automation platform, was found to have issued fraudulent SOC 2 reports on behalf of its customers. The reports traced back to audit firms with no verifiable US presence, contained copied narratives, zero exceptions across dozens of clients, and control descriptions that didn't match the companies' actual infrastructure. Some of those reports had already made it through vendor review processes at real companies. The question worth sitting with: would your process have caught them?
This post is a practical guide for the people who read SOC 2 reports as part of vendor due diligence. It covers four things to look for that distinguish a real audit from a fabricated one.
Why the SOC 2 ecosystem is vulnerable to this
SOC 2 reports are produced by private audit firms. There is no central registry of issued reports. No agency reviews them for quality. No one verifies that the auditor actually showed up.
The AICPA sets the standards and maintains a directory of licensed CPAs, but attestation engagements are largely self-policed. Compliance automation platforms accelerated the problem: they made it faster and cheaper to produce the documentation that an audit requires, which is legitimate, but they also introduced a structural conflict of interest. Many platforms work with "partner" auditors who depend on those platforms for client referrals. An auditor who finds too many issues loses the relationship. The incentive is to sign off, not to scrutinize.
The result is a market where report volume has grown significantly faster than the capacity to scrutinize what's in those reports. Most vendor review processes treat a SOC 2 as a binary: present or not present. That's the gap.
Four red flags in a SOC 2 report
1. The auditor is unverifiable
Every SOC 2 report is signed by a licensed CPA. That CPA's name is in the report. The AICPA maintains a public directory. You can look them up.
Start there. Find the signing CPA. Verify their license is active and their state of licensure matches where the firm operates. Then look at the firm itself: when was the firm registered, does it have a real address, a website with identifiable staff, any public record of attestation work? A registered address in a state with no other traceable presence is worth noting.
In the Delve case, reports traced back to firms with minimal verifiable US presence operating primarily out of India. That's not automatically disqualifying for a firm, since auditors operate across borders, but it raises the question of whether the CPA signing the report had the proximity and access to have actually conducted the engagement.
This check takes fifteen minutes. If you can't locate the signing CPA, you don't have a verified audit.
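If you run this check across many vendors, it helps to record the individual lookups rather than a single pass/fail. A minimal sketch of that record keeping; the field names are my own, not from any standard:

```python
from dataclasses import dataclass

@dataclass
class AuditorVerification:
    """Results of the manual lookups described above (illustrative fields)."""
    cpa_in_aicpa_directory: bool = False
    license_active: bool = False
    license_state_matches_firm: bool = False
    firm_has_verifiable_address: bool = False
    firm_has_identifiable_staff: bool = False

    def red_flags(self) -> list[str]:
        """Return a human-readable list of every check that failed."""
        labels = {
            "cpa_in_aicpa_directory": "signing CPA not found in AICPA directory",
            "license_active": "CPA license not active",
            "license_state_matches_firm": "license state does not match firm location",
            "firm_has_verifiable_address": "no verifiable firm address",
            "firm_has_identifiable_staff": "no identifiable staff on firm website",
        }
        return [msg for field, msg in labels.items() if not getattr(self, field)]
```

The point of the structure is that "we verified the auditor" becomes a list of specific claims someone actually checked, not a checkbox.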
2. Copy-pasted narrative where the content should be unique
SOC 2 reports have a predictable structure. Section 4, which covers control definitions, will look similar across companies. Controls are derived from AICPA criteria, and most compliance platforms provide templated language for them. Similarity there is not a red flag.
The red flag is in Section 3, the system description, and in the auditor's test procedures.
Section 3 is supposed to describe this specific company: its architecture, its infrastructure, its team structure, its operational processes. Two companies in different industries with different tech stacks should not have the same Section 3. If they do, one of two things happened: the company copy-pasted a template and the auditor didn't notice, or the auditor copy-pasted across engagements. Either way, the audit didn't happen the way it should have.
The test procedures tell the same story. An auditor's procedures should reflect the actual work they did: what they sampled, how they tested, what they observed. When those procedures are identical across multiple reports from the same firm, the auditor is not describing independent work.
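If you have several reports on hand, one way to screen for copied narrative is to compare Section 3 texts pairwise and flag any pair that is nearly identical. A rough sketch using Python's standard library; the 0.9 threshold is a guess, not a calibrated number:

```python
from difflib import SequenceMatcher
from itertools import combinations

def narrative_similarity(a: str, b: str) -> float:
    """Word-level similarity ratio between two system descriptions (0..1)."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

def suspicious_pairs(sections: dict[str, str], threshold: float = 0.9):
    """Flag pairs of companies whose Section 3 texts are nearly identical."""
    return [
        (x, y, round(narrative_similarity(sections[x], sections[y]), 2))
        for x, y in combinations(sections, 2)
        if narrative_similarity(sections[x], sections[y]) >= threshold
    ]
```

Templated boilerplate will push scores up across the board, so treat a hit as a prompt to read the two sections side by side, not as proof on its own.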
3. A pattern of zero exceptions
When an auditor tests a control, they're checking whether it worked as described during the audit period, typically six to twelve months of operations. An exception is when it didn't. The company said every code deployment required peer review. The auditor sampled twenty-five deployments and found three that went through without it. That's an exception. It happens. Real companies have late access reviews, missed patches, incomplete log retention. A SOC 2 that covers a year of operations and finds nothing imperfect is not impossible, but it warrants a closer look.
What's not plausible is an auditor who produces reports across dozens of clients and never records a single exception. That pattern doesn't indicate thoroughness. It indicates either that the auditor isn't testing or isn't reporting what they find. The question to ask isn't "does this report have exceptions?" It's "does this auditor ever find them?"
This one is harder to catch from a single report. You'd need to see multiple reports from the same auditor, which most vendor review programs aren't set up to do. But if you have access to reports from prior years for the same vendor, check whether the auditor has ever found anything. If the answer is no across multiple periods, that's worth pressing on.
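If you do have multiple reports from the same firm, the aggregation itself is trivial. A sketch, assuming you've pulled an exception count out of each report by hand:

```python
from collections import defaultdict

def zero_exception_auditors(reports, min_reports: int = 3) -> list[str]:
    """Given (auditor, exception_count) pairs, return auditors who have
    issued at least `min_reports` reports without recording one exception.
    """
    counts = defaultdict(lambda: [0, 0])  # auditor -> [reports seen, total exceptions]
    for auditor, exceptions in reports:
        counts[auditor][0] += 1
        counts[auditor][1] += exceptions
    return sorted(
        auditor for auditor, (n, total) in counts.items()
        if n >= min_reports and total == 0
    )
```

The `min_reports` floor matters: a single clean report is unremarkable, and the signal only emerges across an auditor's body of work.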
4. Controls that don't match the actual technology
This is the most reliable signal and the hardest to fake.
A fabricated report, or one produced by copying another company's report, will describe controls in terms of the tools that company uses, not the tools your vendor uses. The report says code reviews happen in GitHub. The vendor uses Bitbucket. The report references CloudWatch for monitoring. The vendor runs on GCP. AWS guardrails are described in detail. The vendor's infrastructure is on Azure.
During the Delve investigation, at least two reports from different companies described GitHub-based code review controls while the auditor's own test procedures referenced Bitbucket. The subservice provider descriptions appeared in identical language across companies running entirely different architectures. The template didn't know what infrastructure the company was actually using.
You cannot catch this by reading the report in isolation. You have to cross-reference. Look at the subservice providers listed in the report and compare them against what you know about the vendor's actual stack: their public documentation, their engineering blog, the integrations you're configuring as part of onboarding. If you're in an active vendor evaluation, you likely already have visibility into their environment. Use it. A report claiming AWS while the vendor's API resolves to Google Cloud is not a subtle discrepancy. It's a lie the template can't hide.
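This cross-reference is also easy to partially automate once you've written down what you know about the vendor's stack. A sketch; the tool groupings are illustrative, not exhaustive, and naive substring matching will need tightening in practice:

```python
# Tools grouped by category. A report that names a tool in a category
# where the vendor is known to use a *different* one is worth questioning.
TOOL_CATEGORIES = {
    "source control": {"github", "gitlab", "bitbucket"},
    "cloud provider": {"aws", "azure", "gcp"},
    "monitoring": {"cloudwatch", "datadog", "stackdriver"},
}

def stack_mismatches(report_text: str, vendor_stack: set[str]) -> list[tuple[str, str]]:
    """Return (category, tool) pairs the report mentions that conflict
    with what you know the vendor actually uses."""
    text = report_text.lower()
    mismatches = []
    for category, tools in TOOL_CATEGORIES.items():
        known = tools & vendor_stack
        if not known:
            continue  # no knowledge of this part of the stack; can't judge
        for tool in tools - known:
            if tool in text:
                mismatches.append((category, tool))
    return mismatches
```

For a vendor you know runs Bitbucket and Datadog, a report describing GitHub code reviews and CloudWatch alerting would surface immediately.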
Why questionnaires won't catch this
After Delve became public, some security teams responded by sending questionnaires asking vendors whether they had used Delve. That's understandable as an immediate reaction. It is not a verification process.
A vendor who submitted a fraudulent SOC 2 report will answer "no" to that questionnaire. Questionnaires ask whether something happened. They cannot tell you whether the controls in the report actually reflect how the company operates. They cannot tell you whether the auditor exists. They cannot tell you whether the test procedures were real.
The four checks above require reading the actual artifact: not skimming it for a Type II checkbox, but reading it with the intent to find inconsistencies. That's a different activity than most vendor review programs are currently built around.
What this means in practice
SOC 2 reports contain structured claims: about who audited the company, what controls were in place, how those controls were tested, and what the auditor found. Those claims are verifiable. Not all of them, and not always quickly, but more of them than most review processes actually check.
Treating a report as a binary artifact misses the information that distinguishes a real audit from a fabricated one. The Delve case is a clear example of what happens when that information goes unread. It won't be the last.
