The NATO-OpenAI Trust Gap: Why the U.S. CLOUD Act Is a Fiduciary Risk for Board Directors
Can one AI company serve both the Pentagon and a 32-nation alliance? A governance analysis of data sovereignty, cross-border warrants, and the board’s blind spot.
The Core Argument in Plain English
The issue: OpenAI is now pursuing contracts with both the U.S. Pentagon (classified networks) and NATO (unclassified networks). Under the U.S. CLOUD Act, any U.S.-based company can be compelled to hand over data regardless of where it is stored. This creates a structural conflict of interest for a 32-nation alliance built on collective sovereignty.
Why it matters to you: If your organisation uses a U.S.-based AI provider, the data is not only accessible to U.S. authorities—it may also be accessible to UK authorities under the U.S.-UK Data Access Agreement, and to other nations through bilateral treaties. The UK’s IPCO reported just under 360,000 authorisations for investigatory powers in 2023 alone.
What to do: Audit your AI supply chain for jurisdictional exposure. Ask who holds the encryption keys. Demand notification clauses. Evaluate sovereign AI alternatives for your highest-value intellectual property.
The Legal Conflict: The U.S. CLOUD Act
The Clarifying Lawful Overseas Use of Data Act (2018) is a U.S. federal law that allows U.S. authorities to compel U.S.-based companies to provide data—regardless of where that data is physically stored. If the server is in Brussels, Berlin, or Singapore, it does not matter. If the provider is incorporated in the United States, the data is within reach of U.S. law.
Think of it like this: jurisdiction follows the company, not the server. The CLOUD Act attaches to the corporate entity, not the data centre location. This is the structural problem that no amount of contractual language can solve.
For directors outside the United States, this creates what lawyers call a “conflict of laws” problem. Complying with a U.S. CLOUD Act warrant may directly violate European GDPR obligations (specifically Article 48, which restricts transfers based on foreign court orders). The board is placed in an impossible position: break U.S. law, or break EU law. There is no right answer.
In a 2026 survey by Kiteworks, one in three organisations reported experiencing a data-sovereignty-related incident in the previous 12 months. Forty-four percent of European respondents cited concerns about whether their cloud providers can genuinely guarantee data sovereignty. Forty-six percent plan to migrate to EU-based providers. [Source: Kiteworks 2026 Data Sovereignty Report]
The Pentagon Deal & the NATO Conflict
On 27 February 2026, OpenAI announced an agreement with the U.S. Department of Defense to deploy its AI models on the Pentagon’s classified network. The contract permits the military to use the technology for “all lawful purposes,” while specifying three red lines: no mass domestic surveillance, no autonomous weapons without human control, and no high-stakes automated decisions without human approval.
By OpenAI CEO Sam Altman’s own admission, the deal was rushed. On 3 March 2026, Altman wrote that OpenAI “shouldn’t have rushed to get this out on Friday” and acknowledged that the initial announcement “just looked opportunistic and sloppy.” The company subsequently amended the contract to add stronger anti-surveillance language.
Days later, reports emerged that OpenAI is now considering a contract to deploy on NATO’s unclassified networks. (Altman initially told staff it was for NATO’s classified networks; a spokesperson later clarified he misspoke.)
The Fiduciary Paradox
Here is the governance problem in plain English: if the U.S. government can legally compel OpenAI to hand over data under the CLOUD Act, and OpenAI is simultaneously providing AI services to 31 other sovereign nations through NATO, then one alliance member (the U.S.) has potential legal access to the intelligence workflows of every other member through a single provider.
A company cannot offer a “national security loophole” to one client while simultaneously guaranteeing absolute, unbiased neutrality to an international alliance. These two commitments are structurally incompatible.
The phrase “all lawful purposes” sounds reassuring until you consider that “lawful” is a moving target. What is lawful today can be redefined by executive order, legislative change, or classified interpretation. As the Center for Democracy & Technology noted, the contract provides no enforceable red lines beyond having a lawful purpose. The law has not caught up with AI’s capabilities for large-scale data analysis.
The “Snooper’s Charter”: UK & European Intercepts
The trust gap is not exclusively a U.S. problem. Under the UK Investigatory Powers Act 2016 (commonly called the “Snooper’s Charter”) and the U.S.-UK Data Access Agreement (authorised by the CLOUD Act), the UK government can directly request data from U.S.-based providers—bypassing traditional Mutual Legal Assistance Treaties (MLATs).
The scale of these powers is not theoretical. The UK’s Investigatory Powers Commissioner’s Office (IPCO) published its 2023 Annual Report in July 2025, revealing that just under 360,000 authorisations for investigatory powers were made in 2023, with applications rising by 9–10% annually. Over 600 public authorities in the UK hold investigatory powers.
Similar regimes operate across NATO member states:
| Jurisdiction | Authority | Reported Activity (2023–2024) |
|---|---|---|
| United Kingdom | IPCO | ~360,000 authorisations; 9–10% annual increase |
| France | CNCTR | ~23,000 individuals monitored; 25–27% increase in digital techniques |
| Germany | G10 Commission | ~90% approval rate for strategic surveillance requests |
| United States | FISA / CLOUD Act | Authorised to collect data on non-U.S. persons outside the U.S. |
| EU-wide | e-Evidence Regulation | New rules (effective 2026) allow cross-border data demands directly to providers |
The Fiduciary Conflict for Non-U.S./UK Directors
For a director in Germany, France, Italy, or any non–Five Eyes NATO country, the question is this: does adopting a U.S.-based AI platform effectively hand over your data sovereignty to a U.S.-UK legal partnership you are not party to?
Under the U.S.-UK Data Access Agreement, a warrant issued in London can reach data stored in a Virginia data centre—including data belonging to a German or French subsidiary using the same provider. And under many NATO jurisdictions, “gag orders” prevent the provider from ever telling the board that an intercept occurred.
IP Expropriation: The Risk to AI Workflows
Most boards focus on protecting personal data (PII) when they think about data sovereignty. This is necessary but insufficient. The true “crown jewels” at risk are your proprietary AI workflows: your prompts, chain-of-thought logic, model fine-tuning, and R&D outputs.
What is actually at risk (beyond personal data)?
Proprietary prompts and logic. The specific ways your organisation chains AI steps to produce results—your operational edge. If a government intercepts your AI pipeline under a national security warrant, they are not just seeing data. They are seeing your R&D methodology.
Model weights and fine-tuning. If you train or fine-tune a model on a provider’s cloud, the “learned intelligence” resides on their infrastructure. The provider can be legally compelled to decrypt and hand over those weights.
The output pipeline. State actors are not looking for names and addresses. They are looking for R&D breakthroughs, M&A strategy, resource optimisation workflows, and competitive intelligence.
If your AI provider holds the encryption keys (which most do, in order to run compute), they can be legally compelled to decrypt your workflows. To a state actor, your proprietary AI logic is a digital blueprint they can study at scale. The only structural defence is a zero-knowledge architecture where the provider physically cannot access your data—or sovereign infrastructure under your own jurisdiction.
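The key-custody point above can be made concrete with a toy sketch. This is an illustrative model only, not a production scheme: it uses a one-time pad from Python’s standard library purely to show the structural difference, and real BYOK deployments would use authenticated encryption (e.g. AES-GCM) with hardware-backed key management. All function names here are hypothetical.

```python
import secrets

def encrypt_locally(plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt on customer-controlled infrastructure (the BYOK model).

    The key is generated and retained by the customer; only the
    ciphertext is uploaded to the provider's cloud.
    """
    key = secrets.token_bytes(len(plaintext))  # customer-held key, never uploaded
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt_locally(key: bytes, ciphertext: bytes) -> bytes:
    """Decryption is only possible where the key lives: with the customer."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# What the provider stores -- and therefore all that a warrant served
# on the provider can yield -- is the ciphertext alone.
workflow = b"proprietary prompt-chain logic"
key, stored_at_provider = encrypt_locally(workflow)

assert stored_at_provider != workflow                      # provider cannot read it
assert decrypt_locally(key, stored_at_provider) == workflow  # customer still can
```

The governance takeaway is in the data flow, not the cipher: when the provider holds the keys (as most do, in order to run compute on the plaintext), the same warrant yields readable workflows rather than ciphertext.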
The Sovereign Cloud Alternative
The good news for directors is that sovereign AI infrastructure is no longer theoretical. In November 2025, NATO itself signed a multi-million-dollar contract with Google Cloud to deploy sovereign, air-gapped cloud infrastructure through Google Distributed Cloud (GDC). The system is physically disconnected from the public internet, ensuring NATO’s data remains under its direct control and within NATO’s sovereign territory.
This is instructive. NATO, when it needed AI-enabled cloud for its most sensitive operations, chose an air-gapped sovereign architecture rather than relying on a standard commercial cloud deployment. The alliance is explicitly prioritising data residency, operational control, and jurisdictional isolation.
Gartner forecasts that worldwide sovereign cloud spending will reach $80 billion in 2026, with European spending growing by 83%. The European Commission adopted a Declaration for European Digital Sovereignty in November 2025, and multiple EU initiatives (the Data Act, GAIA-X, the Cloud and AI Development Act) are driving procurement requirements toward sovereign-compliant infrastructure.
For boards, the strategic question is clear: if NATO itself does not trust a standard U.S. commercial cloud deployment for classified work, should your organisation?
Fiduciary Checklist for Directors
As a board member, you should pose these four questions to your CTO and legal counsel:
“Who has the master key?”
Does your AI provider have the technical ability to decrypt your data if served with a warrant? Or do you maintain exclusive control of the encryption keys (Bring Your Own Key—BYOK)? If the provider holds the keys, your data is accessible to any government that can compel the provider.
“Which jurisdictions apply?”
If you are a German company using a U.S. provider with servers in the UK, which of those three governments can legally compel your data? The answer, under current law, may be all three. Map your jurisdictional exposure before it maps you.
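A jurisdictional audit of this kind can be sketched as a simple mapping exercise. The rules below merely paraphrase the examples in this article (CLOUD Act attaching to U.S.-incorporated entities, the U.S.-UK Data Access Agreement, UK IPA, EU GDPR/e-Evidence); the regime names and trigger conditions are illustrative simplifications, and a real mapping requires qualified legal counsel.

```python
def applicable_regimes(provider_incorporation: str,
                       server_location: str,
                       customer_location: str) -> set[str]:
    """Illustrative sketch: which legal regimes can plausibly reach the data?"""
    regimes = set()
    if provider_incorporation == "US":
        # Jurisdiction follows the company, not the server.
        regimes.add("US CLOUD Act")
        # The U.S.-UK Data Access Agreement lets UK warrants reach
        # data held by U.S. providers, bypassing MLATs.
        regimes.add("UK IPA 2016 (via US-UK DAA)")
    if "UK" in (server_location, customer_location):
        regimes.add("UK IPA 2016")
    if customer_location in {"DE", "FR", "IT"}:  # illustrative EU member states
        regimes.add("EU GDPR / e-Evidence Regulation")
    return regimes

# The article's example: a German company, a US provider, UK servers.
exposure = applicable_regimes("US", "UK", "DE")
# All three governments appear in the exposure map.
```

Even this toy version makes the board-level point: the exposure set is driven by the provider’s incorporation as much as by where the servers or the customer sit.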
“What is the notification protocol?”
Does your contract require the provider to challenge “gag orders” and notify you if a government requests your data under the CLOUD Act or Investigatory Powers Act? Many providers are contractually silent on this point, meaning you may never know your data was intercepted.
“Is there a sovereign alternative?”
For your most sensitive “crown jewel” data—AI workflows, model weights, R&D pipelines—are you using jurisdictionally isolated sovereign AI infrastructure? Or is everything on a single provider subject to the CLOUD Act?
Red Flags vs. Green Flags
| Red Flags (High Risk) | Green Flags (Governance Best Practice) |
|---|---|
| “All lawful purposes” clauses that allow the provider or government to redefine usage boundaries. | Zero-knowledge architecture: the provider cannot hand over data because they do not hold the decryption keys. |
| Sole-source dependency on one provider for both internal R&D and external defence contracts. | Multi-cloud resilience: critical infrastructure spread across different jurisdictions and providers. |
| Silent warrants: contracts that do not require the provider to challenge gag orders or notify the board. | Transparency reporting: providers that publish government data-request volumes and challenge rates. |
| Provider holds all encryption keys and can be compelled to decrypt at government request. | Confidential computing (TEE): the provider cannot see workflows even while they run in their environment. |
References & Sources
[1] OpenAI, “Our agreement with the Department of War,” openai.com, 27 February 2026. https://openai.com/index/our-agreement-with-the-department-of-war/
[2] Fortune, “OpenAI’s Pentagon deal raises new questions about AI and surveillance,” 2 March 2026. https://fortune.com/2026/03/02/openais-pentagon-deal-raises-new-questions-about-ai-and-mass-surveillance/
[3] The Hill, “OpenAI enhances language in Pentagon AI deal,” 3 March 2026. https://thehill.com/policy/technology/5764396-openai-protections-pentagon-deal/
[4] CNBC, “OpenAI’s Altman admits defense deal ‘looked opportunistic and sloppy,’” 3 March 2026. https://www.cnbc.com/2026/03/03/openai-sam-altman-pentagon-deal-amended-surveillance-limits.html
[5] Gizmodo, “Altman Reportedly Tells Staff OpenAI Wants Another Classified Contract. This Time with NATO,” 4 March 2026. https://gizmodo.com/altman-reportedly-tells-staff-openai-wants-another-classified-contract-this-time-with-nato-2000729253
[6] Reuters via Domain-b, “OpenAI weighing contract to deploy AI on NATO networks,” 4 March 2026. https://www.domain-b.com/technology/artificial-intelligence/openai-nato-ai-contract-defense-2026
[7] IPCO, “Publication of Investigatory Powers Commissioner’s 2023 Annual Report,” 15 July 2025. https://www.ipco.org.uk/news/publication-of-investigatory-powers-commissioners-2023-annual-report/
[8] Google Cloud, “NATO and Google Cloud Sign Multi-Million Dollar Deal for AI-Enabled Sovereign Cloud,” 24 November 2025. https://www.googlecloudpresscorner.com/2025-11-24-NATO-and-Google-Cloud-Sign-Multi-Million-Dollar-Deal-for-AI-Enabled-Sovereign-Cloud
[9] TechRepublic / Kiteworks, “The Global Fight Over Who Controls Your Data Just Escalated,” February 2026. https://www.techrepublic.com/article/news-data-sovereignty-cloud-security-report/
[10] The Register, “Europe set to treble sovereign cloud investment,” 9 February 2026. https://www.theregister.com/2026/02/09/europe_sovereign_cloud_spend
[11] Atlantic Council, “Digital sovereignty: Europe’s declaration of independence?” February 2026. https://www.atlanticcouncil.org/in-depth-research-reports/report/digital-sovereignty-europes-declaration-of-independence/
[12] Axios, “OpenAI-Pentagon deal faces same safety concerns that plagued Anthropic talks,” 1 March 2026. https://www.axios.com/2026/03/01/openai-pentagon-anthropic-safety
[13] Wikipedia: CLOUD Act — https://en.wikipedia.org/wiki/CLOUD_Act (Wikidata: Q56291441)
[14] Wikipedia: NATO — https://en.wikipedia.org/wiki/NATO (Wikidata: Q7184)
[15] Wikipedia: Investigatory Powers Act 2016 — https://en.wikipedia.org/wiki/Investigatory_Powers_Act_2016 (Wikidata: Q25417782)
[16] Wikipedia: Five Eyes — https://en.wikipedia.org/wiki/Five_Eyes (Wikidata: Q379232)
[17] Wikipedia: Data sovereignty — https://en.wikipedia.org/wiki/Data_sovereignty (Wikidata: Q18603731)
© 2026 AI Board Course. This article is for educational and governance training purposes. It does not constitute legal advice. Directors should consult qualified legal counsel for jurisdiction-specific guidance.
Your Board Needs This Framework
The AI Board Course gives directors the language, frameworks, and technical literacy to lead on data sovereignty — not just defer to IT. Taught by June Lai, CFA, CPA, CMA.