US drafts strict guidelines for AI companies after Pentagon's clash with Anthropic, says report. Here's all we know


The Trump administration has drafted strict guidelines for civilian AI contracts, mandating companies to allow all lawful uses of their models. This comes amid tensions between the Pentagon and Anthropic, which refused to grant unrestricted use of its models.


The United States government has drawn up strict guidelines for civilian contracts with artificial intelligence companies that would mandate these firms to allow “any lawful use” of their models by the American military, according to a report by the Financial Times.

The new rules come after the Pentagon's very public clash with Claude AI maker Anthropic over the potential use of its models for automated weapons and mass domestic surveillance programmes. The battle culminated with the Department of War designating the Dario Amodei-led company a Supply Chain Risk (SCR) and barring it from all defense and associated deals.

The rules are aimed at strengthening the procurement of AI services for the US government, the report added. A source also told the publication that the US Department of War is mulling similar conditions for its future military contracts.

The US General Services Administration (GSA), which procures the federal government's software, will be “soliciting further comments” from the industry before the new rules are enforced, the source told FT.

Trump admin draws ‘tight rules’ for civilian AI contracts

AI entities that want to do business with the US government will be required to grant the government an irrevocable license to use their models for all legal purposes, as per the report, which cited a draft of the new guidelines from the GSA.

The new rules would also require contractors to provide “a neutral, non-partisan tool that does not manipulate responses in favour of ideological dogmas such as diversity, equity, inclusion”. The phrasing draws from US President Donald Trump's executive order against alleged “woke” AI models.

The wording of one clause could also challenge compliance with the EU Digital Services Act, as it requires contractors to disclose whether their models have been “modified or configured to comply with any non-US federal government or commercial compliance or regulatory framework”, the report added.

Pentagon vs Anthropic: ‘Not punitive action’

US Undersecretary of War Emil Michael said on a podcast that Anthropic's Supply Chain Risk (SCR) designation came after the company failed to meet the Pentagon's requirement for “all lawful use” of its technology.

Justifying the move, Michael said that the Department of War's discussions with Anthropic in the lead-up to the fallout involved applications for Trump's Golden Dome proposal, as well as scenarios of battles against Chinese hypersonic missiles and drone swarms.

Michael said that Anthropic and Amodei's ethical codes were not in alignment with the government's needs. “I need a reliable, steady partner that gives me something, that’ll work with me on autonomous, because someday it’ll be real and we’re starting to see earlier versions of that. I need someone who’s not going to wig out in the middle,” he said.

He also declined to call the move punitive: “I don't view it as punitive. If their model has this policy bias, I don't want Lockheed Martin using their model to design weapons for me… I can't have that because I don't trust what the outputs may be, because they're so wedded to their own policy preferences,” he said.

Anthropic to fight SCR designation in court

The DoW on Thursday said it had designated Anthropic an SCR, which Amodei confirmed in a blog post where he said the action lacked legal basis.

“Yesterday (March 4) Anthropic received a letter from the Department of War confirming that we have been designated as a supply chain risk to America's national security. We do not believe this action is legally sound, and we see no choice but to challenge it in court,” he wrote.

Key Takeaways

  • The US government is tightening regulations on AI companies to ensure military-friendly practices.
  • Companies like Anthropic face challenges if they do not align with government requirements.
  • The guidelines may impact compliance with international regulations like the EU Digital Services Act.

About the Author

Livemint

