
THE TECHNOLOGY BLIND SPOT
Seventeen Companies
Seventeen companies handled the passport. The person who uploaded it found out only because they spent a weekend reading 34 pages of legal fine print that almost nobody reads. The app took three minutes. The consequences will take considerably longer to unwind.
On a Sunday afternoon in February 2026, a privacy researcher in the European Union held a passport up to a phone camera for a LinkedIn identity verification badge. The app guided the scan, captured a selfie, spun a loading icon for three seconds, and delivered a small blue checkmark. Then the researcher did something unusual: instead of closing the app, they pulled up the privacy policy and terms of service for the company that actually performed the verification. Not LinkedIn. A company called Persona Identities, Inc., headquartered at 981 Mission Street in San Francisco.
Persona maintains a public list of subprocessors, third-party companies that process personal data on its behalf. The researcher counted 17 companies on that list. Sixteen are headquartered in the United States. One is in Canada. Zero are in the European Union. Three of them, Anthropic, OpenAI, and Groqcloud, are AI companies performing “Data Extraction and Analysis” on government-issued identity documents. Amazon Web Services handles “Image Processing”: passport photos and selfies flowing through Amazon’s infrastructure. A company called FingerprintJS, named with admirable directness, handles “Device Analysis.”
This is not an article about LinkedIn or one researcher’s passport. This is about the Subprocessor Blind Spot: the systematic gap between the vendor a law firm vetted and the chain of companies that actually handle the data. The firm that reviews a vendor’s SOC 2 report and negotiates a data processing agreement has completed roughly ten percent of the due diligence that matters. The other ninety percent lives in the subprocessor chain that nobody reads.
Direct Answer
Law firms that evaluate legal technology vendors without auditing the subprocessor chain are exposed to privilege compromise, jurisdictional risk, and regulatory penalties that no SOC 2 report or data processing agreement can prevent. ABA Model Rule 1.6(c) requires “reasonable efforts” to prevent unauthorized disclosure. Reasonable efforts in 2026 must include knowing who actually handles client data, not just who signed the contract.
As I detailed in “I Was Inside EMC When Hackers Stole the Keys to 40 Million Doors,” the RSA breach demonstrated this principle at nation-state scale: attackers who could not penetrate Lockheed Martin’s defenses compromised the authentication vendor Lockheed trusted. The subprocessor problem extends that lesson further down the chain. Your vendor may be secure. Your vendor’s vendors may not be.
The Scale of the Problem
I have spent more than 20 years in enterprise technology, launching products at Dell Technologies, EMC Corporation, VMware, Cisco Systems, and Huawei. In every one of those roles, I watched procurement teams evaluate the primary vendor with surgical precision and ignore the supply chain underneath. The pattern repeats in legal technology, where the stakes are considerably higher because the data at risk includes privileged communications, merger intelligence, and litigation strategy.
The numbers confirm this is not theoretical. According to insurance industry estimates, approximately 40 percent of law firms reported experiencing a security breach in 2024. The average cost of a data breach for professional services firms reached $5.08 million, according to IBM’s 2025 Cost of a Data Breach Report. Third-party vendor compromise ranked as the second most prevalent attack vector across all industries, averaging $4.91 million per incident. Supply chain breaches increased 68 percent year-over-year according to the 2024 Verizon Data Breach Investigations Report (DBIR).
Two cases illustrate the pattern. In June 2024, Kirkland & Ellis appeared as a defendant in a class-action lawsuit after the MOVEit file transfer breach exposed client data across hundreds of organizations. The vulnerability lived in MOVEit Transfer, a tool developed by Progress Software that Kirkland used for a client acquisition. Standard vendor due diligence did not reach the software component where the exploit lived. In March 2025, Berkeley Research Group suffered a ransomware attack during a $700 million leveraged buyout by TowerBrook Capital Partners. The breach exposed sensitive client data at a firm advising on bankruptcy, arbitration, and tax matters. Bloomberg’s reporting did not specify the attack vector, but the timing, during active deal execution, suggests the firm’s threat exposure extended beyond its direct security perimeter.
The common thread: the breach did not originate with the primary vendor. It originated deeper in the chain.
Anatomy of a Subprocessor Chain: The Persona Case
The following table reproduces Persona’s published subprocessor list as of September 8, 2025. Persona holds Gartner Magic Quadrant Leader and Forrester Wave Leader positions in identity verification (analyst rankings that carry inherent vendor-payment dynamics). I present it not to single out Persona but because it is one of the few companies that publishes this list transparently. Most vendors require a formal request before disclosing subprocessor information.
Seventeen companies. Sixteen in the United States. One in Canada. Zero in the EU. Every entity that processes a European passport scan for a LinkedIn verification badge falls under the jurisdiction of North American courts and the Clarifying Lawful Overseas Use of Data Act (CLOUD Act).
Persona’s reach extends well beyond LinkedIn. Reddit, Roblox, OpenAI, and Discord all use Persona for identity or age verification. The chain above does not serve one platform. It serves the identity verification infrastructure of the consumer internet.
The CLOUD Act and the Illusion of Data Residency
Persona maintains data centers in both the United States and Germany. For European attorneys, the German location sounds reassuring: a passport scan in a Frankfurt facility, governed by the GDPR, physically distant from American courtrooms.
That reassurance evaporates the moment you read the statute. Signed into law in 2018, the CLOUD Act allows US law enforcement to compel any US-incorporated company to produce data in its possession, regardless of where that data is physically stored. The blinking lights on the rack are in Germany. The legal authority over those lights is in Washington.
For law firms, this creates a privilege risk that most data residency strategies miss. If your firm uses a US-based e-discovery platform, contract lifecycle management tool, or AI-powered research service, and that vendor stores privileged material on its infrastructure, the CLOUD Act provides a mechanism for compelled production. Your data processing agreement does not override a federal warrant. As I analyzed in the Heppner AI privilege ruling, AI platforms that disclaim confidentiality in their terms of service create privilege exposure by design. When those platforms sit three layers deep in your vendor’s subprocessor chain, the exposure multiplies.
The EU-US Data Privacy Framework (DPF) offers uncertain shelter. The DPF is the third attempt by Europe and America to agree on data transfer rules. The first two agreements, Safe Harbor and Privacy Shield, were struck down by the European Court of Justice after challenges led by privacy advocate Max Schrems through his organization noyb. The DPF survived its first challenge in September 2025, when the EU General Court dismissed a case brought by French parliamentarian Philippe Latombe. Latombe appealed to the CJEU on October 31, 2025. Noyb has stated publicly that “a broader review of US law should yield a different result.” The DPF rests on Executive Order 14086, a presidential directive revocable without legislative action, and relies on oversight from the Privacy and Civil Liberties Oversight Board, which has been effectively paralyzed. For firms planning beyond the next fiscal quarter, the DPF is not bedrock. It is scaffolding.
The Counterargument: Why Standard Vendor Review Should Be Enough
I can already hear the objection, because I have heard it in every enterprise procurement discussion for two decades.
“Our vendor has a SOC 2 Type II report. They signed our data processing agreement. They completed our security questionnaire. We have done our due diligence.”
Every element of that objection is correct. And none of it addresses the problem. SOC 2 reports confirm that the primary vendor has implemented controls around security, availability, and confidentiality within its own environment. Data processing agreements contractually bind the vendor to specific data handling obligations. Security questionnaires surface configuration weaknesses and policy gaps. None of this is theater.
Here is what those instruments do not cover: the seventeen other companies that touch your data after the vendor receives it. A SOC 2 report for Persona tells you about Persona’s controls. It tells you nothing about whether Anthropic, OpenAI, or Groqcloud have implemented equivalent protections for the passport data routed through their systems. The Verizon DBIR found that supply chain breaches jumped 68 percent in a single year. Those breaches did not happen because primary vendors had weak SOC 2 programs. They happened because the chain behind the vendor was unaudited, unmonitored, and in many cases unknown to the customer.
Standard vendor review is necessary. It is not sufficient.
Where This Framework Falls Short
Before presenting the due diligence framework below, I should acknowledge its limits. Subprocessor chain audits are resource-intensive. A 14-attorney firm does not have a dedicated procurement team capable of evaluating 17 subprocessors for every vendor. The Persona case involves consumer identity verification, not a product designed for law firms; legal technology vendors may maintain shorter chains or stronger contractual controls. Even a thorough audit produces a point-in-time snapshot, and vendors add and remove subprocessors regularly.
These limitations are real. They do not justify doing nothing. Some vendors will refuse to disclose subprocessor information entirely. That refusal is itself a data point, and it should weigh heavily in the procurement decision. The framework below is designed to fit within existing vendor review cycles, not to require a new department.
A Due Diligence Framework for Subprocessor Chain Audits
ABA Formal Opinion 477R (2017) requires attorneys to assess the risks of transmitting client information and to use “reasonable efforts” to prevent unauthorized access. ABA Formal Opinion 483 (2018) requires procedures for monitoring and responding to data breaches. Model Rule 1.1, Comment 8 requires attorneys to “keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology.” In 2026, relevant technology includes the subprocessor infrastructure your vendors rely on. What follows is a structured approach developed from two decades of enterprise technology procurement.
Phase 1: Subprocessor Identification
Request the complete subprocessor list from every legal technology vendor your firm uses. Under the GDPR, processors must disclose this information. For each subprocessor, document four things: company name, headquarters location, the specific service performed, and the categories of data it handles. Pay particular attention to subprocessors performing AI or machine learning functions. As I detailed in “Your AI Tool Doesn’t Keep Secrets,” consumer AI platforms routinely reserve the right to use inputs for model training unless users affirmatively opt out.
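The Phase 1 inventory can be as simple as a structured record per subprocessor. The sketch below, in Python, shows one way to capture the four required fields and to flag AI functions for closer review; the keyword heuristic and the example entries (drawn from Persona's published list, with service labels as disclosed) are illustrative assumptions, not a standard.

```python
from dataclasses import dataclass, field

@dataclass
class Subprocessor:
    """One row in the Phase 1 inventory: who, where, what, and which data."""
    name: str
    headquarters: str          # country of incorporation, not data center location
    service: str               # function as described in the vendor's disclosure
    data_categories: list[str] = field(default_factory=list)

    @property
    def is_ai_function(self) -> bool:
        # Crude keyword heuristic to route AI/ML entries into a review queue.
        keywords = ("ai", "machine learning", "extraction")
        return any(k in self.service.lower() for k in keywords)

# Example entries from Persona's published list.
inventory = [
    Subprocessor("Anthropic", "United States", "Data Extraction and Analysis",
                 ["identity document images"]),
    Subprocessor("Amazon Web Services", "United States", "Image Processing",
                 ["passport photos", "selfies"]),
    Subprocessor("FingerprintJS", "United States", "Device Analysis",
                 ["device telemetry"]),
]

ai_review_queue = [s.name for s in inventory if s.is_ai_function]
```

A spreadsheet works just as well; the point is that every subprocessor gets the same four fields, so gaps are visible at a glance.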
Phase 2: Jurisdictional Mapping
Map every subprocessor to its legal jurisdiction: the country of incorporation, not the country where the data center sits. A US-incorporated subprocessor is subject to the CLOUD Act regardless of server location. For firms handling cross-border matters, identify any chain where zero EU-based entities process the data. With Persona, all 17 subprocessors are North American. When the entire processing chain sits outside the EU, GDPR enforcement becomes practically difficult and CLOUD Act exposure becomes certain.
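Once the inventory exists, the jurisdictional test is mechanical: if any entity in the chain is US-incorporated, the chain is CLOUD Act-exposed. A minimal sketch, with an illustrative (not authoritative) set of US-incorporated names:

```python
# Phase 2 sketch: jurisdiction follows incorporation, not data center location.
# This set is illustrative; verify incorporation for each vendor yourself.
US_INCORPORATED = {"Anthropic", "OpenAI", "Amazon Web Services",
                   "FingerprintJS", "Snowflake", "MongoDB"}

def cloud_act_exposed(chain: list[str]) -> bool:
    """A chain is exposed if any subprocessor is US-incorporated,
    regardless of where its servers physically sit."""
    return any(name in US_INCORPORATED for name in chain)

def eu_entities(chain: list[str], eu_incorporated: set[str]) -> list[str]:
    """EU-incorporated entities in the chain; an empty result means GDPR
    enforcement against the chain is practically difficult."""
    return [name for name in chain if name in eu_incorporated]

chain = ["Anthropic", "Amazon Web Services", "FingerprintJS"]
exposed = cloud_act_exposed(chain)   # True: every entity here is US-incorporated
in_eu = eu_entities(chain, set())    # empty: no EU-based processors
```

The output of this phase is a single yes/no per vendor chain, which makes it easy to compare vendors side by side.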
Phase 3: Data Flow Analysis
Determine what data each subprocessor actually receives. The Persona chain illustrates the fragmentation: Anthropic, OpenAI, and Groqcloud receive passport images for “Data Extraction and Analysis.” AWS handles “Image Processing.” FingerprintJS captures device telemetry. Snowflake and MongoDB store derived data in separate database services. Each subprocessor sees a different slice, but together the 17 companies reconstruct a complete identity profile that no single vendor’s privacy policy fully describes.
For law firms, the critical question is whether any subprocessor in the chain receives privileged or confidential client data. If your e-discovery vendor uses an AI subprocessor for document classification, does that AI company ingest full document text or metadata only? Does it retain inputs for model improvement? Persona’s privacy policy claims “legitimate interest” under the GDPR as its legal basis for using uploaded identity documents to train AI models. Whether that basis survives a regulatory challenge remains an open question.
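The content-versus-metadata question above lends itself to the same tabular treatment. The sketch below flags subprocessors that ingest content rather than metadata only; the entry names (including the hypothetical "ai-classifier" e-discovery subprocessor) and category labels are assumptions for illustration.

```python
# Phase 3 sketch: record what each subprocessor receives, then flag any
# that ingest document content rather than metadata only.
CONTENT_CATEGORIES = {"document text", "document images", "passport images"}

flows = {
    "ai-classifier": {"document text"},   # hypothetical e-discovery AI subprocessor
    "analytics": {"metadata"},
    "storage": {"document text", "metadata"},
}

content_ingesting = sorted(name for name, cats in flows.items()
                           if cats & CONTENT_CATEGORIES)
```

Any name that lands on the content-ingesting list gets the follow-up questions: does it retain inputs, and under what legal basis?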
Phase 4: Retention and Legal Basis Assessment
For each subprocessor handling sensitive data, evaluate the legal basis for processing and the data retention policy. Look for exception clauses. They are the fine print that swallows the rule. Persona’s policy states biometric data is deleted within six months, “unless Persona is otherwise required by law or legal process to retain the data.” Combined with the CLOUD Act, that exception could extend retention indefinitely based on a US legal process the data subject never learns about, possibly accompanied by a gag order preventing disclosure. A six-month retention policy with a law enforcement exception is not a six-month retention policy. It is an indefinite retention policy in disguise.
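For risk-modeling purposes, the rule above can be encoded directly: a stated retention period with a legal-process carve-out should be scored as indefinite. A minimal sketch, with hypothetical field names:

```python
# Phase 4 sketch: a retention period with a law-enforcement exception is
# treated as open-ended, since the carve-out can extend retention without
# the data subject's knowledge. Field names are hypothetical.
def effective_retention(stated_months: int, has_legal_process_exception: bool) -> str:
    if has_legal_process_exception:
        return "indefinite (exception clause)"
    return f"{stated_months} months"

# Persona's policy: six months, "unless ... required by law or legal process".
persona_effective = effective_retention(6, True)   # -> "indefinite (exception clause)"
```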
Phase 5: Contractual Protection Review
Review the vendor’s terms of service for liability limitations, arbitration clauses, and governing law provisions. Persona’s terms cap liability at $50 USD and require mandatory binding arbitration through the American Arbitration Association, even for European users. While GDPR statutory rights under Articles 79 through 82 cannot be waived by terms of service, the practical enforcement path matters. Ensure your firm’s data processing agreements include flow-down obligations binding subprocessors to equivalent data protection standards. A contract with your vendor means little if the vendor’s subprocessors operate under weaker protections.
Each of these five phases addresses a specific failure mode documented in the cases above. Kirkland’s MOVEit exposure was an identification failure: no one mapped the software supply chain beneath the vendor. The CLOUD Act risk and the AI subprocessor training question are jurisdictional and data flow failures, respectively, both invisible to a firm that stops at the primary vendor’s SOC 2 report. The retention exception clause that turns six months into indefinite is a legal basis failure. The $50 liability cap is a contractual protection failure. The checklist that follows distills all five into questions your firm can deploy in the next vendor review cycle.
The Checklist: 15 Questions for Your Next Vendor Review
Exhibit B: Subprocessor Due Diligence Checklist
1. How many subprocessors handle data associated with our account, and where is each incorporated?
2. Which subprocessors perform AI, machine learning, or automated data analysis functions?
3. Does any subprocessor use our data for model training, product improvement, or purposes beyond direct service delivery?
4. What is the legal basis for any model training or secondary processing: consent, legitimate interest, or contractual necessity?
5. Are all subprocessors in jurisdictions with adequate data protection frameworks, or is there CLOUD Act exposure?
6. Does the vendor notify clients before adding or changing subprocessors, and does the contract provide an objection mechanism?
7. What data categories does each subprocessor receive: metadata only, document content, user behavior, or biometric data?
8. Does any subprocessor have access to attorney-client privileged material or work product?
9. What are the data retention policies for each subprocessor, and what exceptions exist for law enforcement requests?
10. Has any subprocessor experienced a data breach in the past three years, and what was the scope?
11. Does the vendor’s contract include flow-down obligations binding subprocessors to equivalent protections?
12. What liability caps apply, and do they survive GDPR statutory claims?
13. Does the vendor’s dispute resolution clause require arbitration, and in which jurisdiction?
14. Does the vendor publish a current, dated subprocessor list, and when was it last updated?
15. Can the vendor provide SOC 2 or equivalent certification for its critical subprocessors, not just for itself?
Three Minutes
That researcher spent a Sunday afternoon holding a passport up to a phone camera for a blue checkmark. Three minutes to verify. A weekend to understand what the verification actually authorized: a data trail stretching from San Francisco to San Jose to Chicago to Bozeman, Montana, through 17 companies that most users will never learn exist.
Every SaaS platform your firm uses sits atop a similar chain. The only difference between Persona and most of your legal technology vendors is that Persona publishes the list. Ask yours if they will do the same.
The next breach notification your client receives may trace back to a subprocessor neither of you knew existed.
Due diligence does not end at the contract. It begins there.
This blog provides general information for educational purposes only and does not constitute legal advice. Consult qualified counsel for advice on specific situations.
About the Author
JD Morris is Co-Founder and COO of LexAxiom. With over 20 years of enterprise technology experience and credentials including an MLS from Texas A&M, MEng from George Washington University, and dual MBAs from Columbia Business School and Berkeley Haas, JD focuses on the intersection of legal technology, cybersecurity, and professional responsibility.
Connect: LinkedIn | X | Bluesky
References
1. The Local Stack, “I Verified My LinkedIn Identity. Here’s What I Actually Handed Over,” February 16, 2026.
2. Persona IDV Privacy Policy, withpersona.com/legal/idv-privacy-policy (Last Updated: May 8, 2025).
3. Persona Subprocessors List, withpersona.com/legal/subprocessors (Last Updated: September 8, 2025).
4. Verizon, 2024 Data Breach Investigations Report.
5. IBM, Cost of a Data Breach Report 2025.
6. Clio Legal Trends Report 2024 (Clio sells practice management software; cited for independent industry data).
7. Bloomberg Law, “Kirkland & Ellis Targeted in Massive MOVEit Data Breach Lawsuit,” June 10, 2024.
8. Bloomberg, “Consulting Firm BRG Suffers Cyberattack Amid LBO Debt Sale,” March 6, 2025.
9. noyb, “EU-US Data Transfers: Time to Prepare for More Trouble to Come,” December 10, 2025.
10. WilmerHale, “European Court of Justice to Review Challenge to EU-U.S. Data Privacy Framework,” December 1, 2025.
11. Persona, “About Us” and industry analyst citations, withpersona.com (accessed February 20, 2026).
12. Embroker, “Law Firm Cyberattacks: Stats and Trends for 2025,” August 27, 2025.
13. ABA Model Rule 1.6(c), Confidentiality of Information.
14. ABA Model Rule 1.1, Comment 8, Maintaining Competence (Technology).
15. ABA Formal Opinion 477R (May 2017), Securing Communication of Protected Client Information.
16. ABA Formal Opinion 483 (October 2018), Lawyers’ Obligations After an Electronic Data Breach or Cyberattack.
17. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), 18 U.S.C. § 2713 (2018).
18. Morris, JD. “I Was Inside EMC When Hackers Stole the Keys to 40 Million Doors.” Morris Legal Technology Blog, The Technology Blind Spot.
19. Morris, JD. “The Judge Used Nine Words: Why AI-Generated Legal Work Loses Privilege Protection.” Morris Legal Technology Blog, The Technology Blind Spot.
20. Morris, JD. “Your AI Tool Doesn’t Keep Secrets.” Morris Legal Technology Blog, The Technology Blind Spot.