Trump Moves to Ban Anthropic From the US Government | WIRED

WIRED - AI · 8 min read · Article

Summary

President Trump orders a ban on Anthropic's AI tools for federal agencies, citing military applications and a clash over AI usage restrictions, marking a significant shift in government-tech relations.

Why It Matters

This decision highlights the ongoing tension between government interests and corporate ethics in AI development. It raises questions about the role of private companies in military applications and the implications for AI safety and governance.

Key Takeaways

  • Trump's order halts federal use of Anthropic's AI tools amid military concerns.
  • The Pentagon's designation of Anthropic as a supply chain risk limits its military contracts.
  • The conflict underscores the growing involvement of tech companies in defense work.
  • Anthropic's refusal to comply with military demands reflects broader ethical debates in AI.
  • Support for Anthropic from employees at other AI firms indicates industry divisions over military collaboration.

US President Donald Trump announced Friday that he was instructing every federal agency to “immediately cease” use of Anthropic’s AI tools. The move comes after Anthropic and top officials clashed for weeks over military applications of artificial intelligence.

“The Leftwing nut jobs at Anthropic have made a DISASTROUS MISTAKE trying to STRONG-ARM the Department of War,” Trump said in a post on Truth Social.

Trump said that there would be a “six month phase out period” for agencies using Anthropic, which could allow time for further negotiations between the government and the AI startup. The Pentagon and Anthropic did not immediately respond to requests for comment.

Shortly after the president’s announcement, defense secretary Pete Hegseth said that Anthropic would also be designated a “supply chain risk,” a move normally reserved for foreign businesses considered a danger to American national security. The designation will bar the US military and its contractors and suppliers from working with the AI company.

Hegseth also lashed out at Anthropic and its CEO, Dario Amodei, over the company’s refusal to agree to the Pentagon’s demands. “Cloaked in the sanctimonious rhetoric of ‘effective altruism,’ they have attempted to strong-arm the United States military into submission—a cowardly act of corporate virtue-signaling that places Silicon Valley ideology above American lives,” Hegseth wrote on X.

The Department of Defense has sought to change...

Related Articles

LLMs

Von Hammerstein’s Ghost: What a Prussian General’s Officer Typology Can Teach Us About AI Misalignment

Greetings all - I've posted mostly in r/claudecode and r/aigamedev a couple of times previously. Working with CC for personal projects re...

Reddit - Artificial Intelligence · 1 min ·
AI Safety

As more Americans adopt AI tools, fewer say they can trust the results | TechCrunch

AI adoption is rising in the U.S., but trust remains low, with most Americans concerned about transparency, regulation, and the technolog...

TechCrunch - AI · 6 min ·
AI Safety

The state of AI safety in four fake graphs

submitted by /u/tekz

Reddit - Artificial Intelligence · 1 min ·
Machine Learning

[2603.14267] DiFlowDubber: Discrete Flow Matching for Automated Video Dubbing via Cross-Modal Alignment and Synchronization

Abstract page for arXiv paper 2603.14267: DiFlowDubber: Discrete Flow Matching for Automated Video Dubbing via Cross-Modal Alignment and ...

arXiv - AI · 4 min ·
