What sparked the DoD-Anthropic clash?

S&T – IT

5 MARCH 2026

  • The U.S. Department of Defence (styled as the Department of War under the second Donald Trump administration) has entered into a public spat with the AI firm Anthropic, which makes the Claude AI product.
  • The DoD has threatened to designate Anthropic a “supply chain risk,” dissuading a wide variety of firms that work with the U.S. government from patronising Anthropic’s products.
  • ChatGPT maker OpenAI subsequently entered the picture, obtaining an agreement it said was not radically different from what Anthropic wanted.

Claude: AI chatbot

  • Claude is an AI chatbot that, among other things, helps organisations and individual users create and modify code.
  • Its Claude Code product has been exceptionally well received for its capabilities.
  • Claude Code is among the few AI products that run on extremely powerful large language models (LLMs) while also supporting on-device creation and editing of tools, provided it has access to a range of software libraries to work with.
  • The product is very compelling to the defence establishment because it can iterate on high-tech weapons and defence systems.
  • Recruitment of programmers for these systems tends to be slow, as any critical weapons system is protected by several layers of secrecy, necessitating security clearances that can be time-consuming to obtain.
  • Claude Code has been a compelling proposition for the DoD, as it likely allows the department to iterate quickly on the programmes that drive its technology.
  • While it does not execute programming tasks perfectly every time, it performs well enough that development timelines have shrunk in organisations that have deployed it widely, especially among experienced software developers.

Why did Anthropic clash with the DoD?

  • Anthropic was onboarded by the DoD as part of a $200 million contract in June 2025, which allowed the U.S. government to use Claude’s services from dedicated infrastructure hosted on Amazon Web Services.
  • The issues between the firm and the DoD started on January 9, 2026, when defence secretary Pete Hegseth published a memorandum entitled “Accelerating America’s Military AI Dominance,” in which he called for the elimination of “blockers to data sharing, Authorizations to Operate (ATOs), test and evaluation and certification, contracting, hiring and talent management, and other policies that inhibit rapid experimentation and fielding”.
  • Anthropic has a much-publicised “constitution” for Claude that discourages the model from supporting widespread surveillance and enabling fully autonomous weaponry.
  • Dario Amodei, the firm’s co-founder, insisted on strong language in the agreement between the DoD and Anthropic to bake in protections against domestic surveillance of U.S. residents and against the enabling of fully autonomous weaponry.
  • The firm was given until February 27, 2026 to relent and let the DoD have completely unrestricted access to its models.
  • It refused, saying in a blog post that it would help the DoD transition to a new provider.
  • The DoD then classified Anthropic as a supply chain risk, a designation usually applied to firms whose practices are so unsound that their products could provide foreign adversaries a backdoor into critical systems.
  • While this designation only bars DoD suppliers and partners from using Claude on systems dedicated to DoD work, there are concerns that executives may err on the side of caution and sever ties with Claude entirely.

OpenAI’s agreement

  • OpenAI negotiated an agreement with the DoD that the former claims has the same protections against surveillance and fully autonomous weaponry that Anthropic sought.
  • It is not fully clear why OpenAI was able to land this deal while Anthropic was cast out.
  • “The Department of War may use the AI System for all lawful purposes, consistent with applicable law, operational requirements, and well-established safety and oversight protocols,” a portion of the agreement made public by OpenAI said.
