US judge blocks Pentagon’s Anthropic blacklisting for now


Mar 27, 2026 09:42 AM IST

Anthropic's designation was the first time a U.S. company had been publicly designated a supply-chain risk under an obscure government-procurement statute aimed at protecting military systems from foreign sabotage. (Image: Reuters)

A U.S. judge on Thursday temporarily blocked the Pentagon's blacklisting of Anthropic, the latest turn in the Claude maker's high-stakes fight with the military over AI safety on the battlefield.

Anthropic's lawsuit in California federal court alleges that Defense Secretary Pete Hegseth overstepped his authority when he designated Anthropic a national security supply-chain risk, a label the government can apply to companies that expose military systems to potential infiltration or sabotage by adversaries.

Hegseth's unprecedented move, which followed Anthropic's refusal to allow the military to use its AI chatbot Claude for U.S. surveillance or autonomous weapons, blocked Anthropic from certain military contracts. Anthropic executives have said the designation could cost the company billions of dollars in lost business and reputational harm.

Anthropic says AI models are not reliable enough to be used safely in autonomous weapons and that it opposes domestic surveillance as a violation of rights; the Pentagon counters that private companies should not be able to constrain military action.

U.S. District Judge Rita Lin, an appointee of former Democratic President Joe Biden, handed down the ruling at a hearing in San Francisco after Anthropic asked for a temporary order blocking the designation while the litigation plays out. Lin's ruling is not final, and the case is still pending.

In its March 9 lawsuit, Anthropic alleged the government violated its right to free speech under the First Amendment of the Constitution by retaliating against its views on AI safety. The company said it was not given a chance to dispute the designation, in violation of its Fifth Amendment right to due process. The lawsuit calls the decision unlawful, unsupported by facts and inconsistent with the military's past praise of Claude.

The Justice Department countered that Anthropic's refusal to lift the restrictions could cause uncertainty in the Pentagon over how it could use Claude and risk disabling military systems during operations, according to a court filing. The government said the designation stemmed from Anthropic's refusal to accept contractual terms, not from its views on AI safety.

Anthropic has a second lawsuit pending in Washington, D.C., over a separate Pentagon supply-chain risk designation that could lead to its exclusion from civilian government contracts.