Microsoft threw its support behind Anthropic on Tuesday, saying a judge should issue a restraining order that would block the Pentagon's designation of the artificial intelligence giant as a supply chain risk "for all existing contracts."
Such an order would "enable a more orderly transition and avoid disrupting the American military's ongoing use of advanced AI," Microsoft said in a filing in U.S. District Court in San Francisco. Without that order, Microsoft warned that it, along with other technology companies, would need to "act immediately to alter existing product and contract configurations" used by the Defense Department.
"This could potentially hamper U.S. warfighters at a critical point in time," the filing said.
Last week, the DOD officially banned Anthropic's technology and deemed the company a supply chain risk, a label that's historically been reserved for foreign adversaries. The designation, which was effective immediately, will require defense vendors and contractors to certify that they don't use Anthropic's models in their work with the Pentagon.
Anthropic sued the Trump administration on Monday, calling the government's actions "unprecedented and unlawful" and claiming that they are "harming Anthropic irreparably" by putting hundreds of millions of dollars' worth of contracts in jeopardy in the near term.
Microsoft's comments appeared Tuesday in a motion asking the court for permission to file an amicus brief. Such briefs are submitted by parties that are not named in a given case but that have relevant expertise or will be affected by the outcome.
In November, Microsoft announced plans to invest up to $5 billion in Anthropic. Microsoft has also been a major investor in Anthropic's rival OpenAI since 2019.