Anthropic risks pariah status after Pentagon calls it a supply-chain risk

Anthropic chief executive Dario Amodei vowed to fight the Defense Department's designation in court, saying “we do not believe this action is legally sound.”

Anthropic PBC runs the risk of losing a wide range of United States government business after the Defense Department declared it a supply-chain risk, a rare designation that until now has only been assigned to companies from adversary nations like China’s Huawei Technologies Co.

Such a penalty has never been imposed on an American company, let alone one at the leading edge of new technology that the government has declared a priority, according to contracting and national security specialists. The Pentagon’s decision, they said, risks setting a dangerous precedent for companies like Anthropic seeking to innovate in artificial intelligence.

“Using this tool against a domestic AI firm sends a troubling signal that could chill innovation and weaken the very technology ecosystem the United States needs to stay competitive,” said Morgan Plummer, vice president of policy at Americans for Responsible Innovation. “These authorities were designed to keep foreign adversaries out of our supply chains, not to punish American companies for building safeguards into their technology.”

The move threatens to unravel Anthropic’s US$200 million contract to provide the Pentagon with classified AI tools, and could bar it from partnering with other companies on defence work. While that’s a fraction of the US$20 billion in revenue the firm has projected for 2026, the supply-chain risk label threatens to cast a pall over the company, whose AI tools have quickly gained favour in the corporate world.

The decision culminated weeks of tense negotiations over access to the company’s technology. Talks broke down last week after the firm demanded assurances that its AI wouldn’t be used for mass surveillance of Americans or autonomous weapons deployment, prompting President Donald Trump to order U.S. agencies to cease work with Anthropic and Defense Secretary Pete Hegseth to threaten the rarely invoked supply-chain exclusion.

To implement its finding, the Pentagon is relying on a measure known as section 3252 of the law governing the U.S. armed forces that permits the Defense Department to bar a company as a contractor if it’s found to imperil the supply chain. It defines risk as the potential that “an adversary may sabotage, maliciously introduce unwanted function, or otherwise subvert” the technology or service being provided.

The provision requires the U.S. defence secretary to follow procedural steps that include demonstrating the supply-chain risk and showing that less-intrusive measures weren’t available. Hegseth has informed Congress of his decision in letters to top Republicans and Democrats on the House and Senate committees for armed services, appropriations and intelligence, according to correspondence seen by Bloomberg.

“This determination is based in part on a risk analysis by the DoW and input from senior DoW personnel that the Covered Entity’s restrictions on the use of its products and services introduces national security risks to the DoW’s supply chain,” Hegseth wrote, referring to Anthropic and using an acronym for the Department of War, the name he now favours for the Department of Defense.

In a blog post Thursday, Anthropic chief executive Dario Amodei vowed to fight the designation in court, saying “we do not believe this action is legally sound.”

Alan Rozenshtein, a professor at the University of Minnesota Law School, said that a supply-chain risk designation under section 3252 shouldn’t even apply in Anthropic’s case. In writing the law, he said, Congress was taking aim at foreign companies to address “malware or back doors or sabotage into government systems,” not target an American business like San Francisco-based Anthropic.

“If this counted as a supply chain risk, then anytime the government disagreed with any U.S. company about any contract terms, it could call that company a supply chain risk and destroy it,” Rozenshtein said. “I don’t think that’d be constitutional.”

Trump and Hegseth have spelled out a six-month transition period to shift AI work to other providers, leaving a door open to more talks. The president often takes hard-line public stances on issues — his frequent threats on tariffs, for example — that he later softens. It’s possible that the supply-chain designation is also a negotiating tactic, aimed at forcing Anthropic to ease the conditions it’s seeking to impose on use of AI for warfare.

For now, though, talks have stalled. In his post Thursday, Amodei said he’d been holding “productive” conversations with the Pentagon regarding the company’s concerns. Yet Emil Michael, the U.S. under secretary of defence for research and engineering who had been negotiating with Amodei over the past several weeks, said in an X post late Thursday that conversations with the company were over.

“I want to end all speculation: there is no active @DeptofWar negotiation with @AnthropicAI,” Michael wrote.

Already, other U.S. government agencies including the Treasury Department and General Services Administration have said they’re dropping Anthropic in the wake of Trump’s order. The company’s arch-rival OpenAI announced an agreement of its own for classified Pentagon AI work hours after Trump and Hegseth demanded Anthropic’s exit from government.

Against Huawei, the U.S. government moved nearly a decade ago to declare the Shenzhen, China-based telecommunications equipment maker a supply-chain risk and bar it from government procurement, then gradually escalated restrictions with measures from agencies including the Federal Communications Commission to block it from working with any U.S. company.

While Hegseth had threatened last week to bar other military contractors or their partners from conducting “any commercial activity with Anthropic,” the Pentagon’s official designation stopped short of that. Amodei said that the statute invoked is narrowly tailored enough to keep it from affecting other Anthropic business that’s unrelated to specific Pentagon contracts.

That offered some reassurance for customers and investors who feared the company could lose the ability to do any business with companies that worked with the Pentagon. Spokespeople for Microsoft Corp. and Alphabet Inc.’s Google said their companies had concluded that they can continue to work with Anthropic on non-defence projects.

Even so, the designation means the company has to stop working with Palantir Technologies Inc., another military contractor. That includes Palantir’s use of Anthropic’s Claude in the digital mission control platform known as Maven Smart System, which has been deployed in the U.S. military’s Iran campaign.

The decision also threatens to slow a broader Pentagon effort to accelerate adoption of AI across the U.S. military. Until recently, Anthropic provided the only AI system that could operate in the Pentagon’s classified cloud, and its Claude Gov tool has become a favoured option among defence personnel for its ease of use.

“It’s a good capability” and removing it is “going to be painful for all involved,” said Lauren Kahn, a senior research analyst at Georgetown’s Center for Security and Emerging Technology. “I really think ultimately, who suffers is the war fighter.”

As the Pentagon fight escalated, Anthropic has drawn support across Silicon Valley. Tech groups representing major companies including Google and Apple Inc. have urged Trump to reconsider designating Anthropic a national security risk, arguing that would cause detrimental ripple effects for the rest of the industry.

“While this is specifically about Anthropic, the results of this are going to be broader than Anthropic because it’s sending an overall message to the AI community about how the Pentagon is viewing these products and standards,” said Jennifer Huddleston, senior fellow in technology policy at Cato Institute. “What does this say about the government’s willingness to intervene and take that choice away from these American companies?”

—With assistance from Katrina Manson, Roxana Tiron and Courtney Subramanian.

Bloomberg.com