{"id":463421,"date":"2026-03-08T00:10:07","date_gmt":"2026-03-08T00:10:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/463421\/"},"modified":"2026-03-08T00:10:07","modified_gmt":"2026-03-08T00:10:07","slug":"what-does-the-us-militarys-feud-with-anthropic-mean-for-ai-used-in-war-ai-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/463421\/","title":{"rendered":"What does the US military\u2019s feud with Anthropic mean for AI used in war? | AI (artificial intelligence)"},"content":{"rendered":"<p class=\"dcr-130mj7b\">Anthropic\u2019s ongoing fight with the Department of Defense over what <a href=\"https:\/\/www.theguardian.com\/us-news\/2026\/feb\/26\/anthropic-pentagon-claude\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">safety restrictions<\/a> it can put on its <a href=\"https:\/\/www.theguardian.com\/technology\/artificialintelligenceai\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">artificial intelligence<\/a> models has captivated the tech industry, acting as a test of how AI may be used in war and the government\u2019s power to coerce companies to meet its demands.<\/p>\n<p class=\"dcr-130mj7b\">The negotiations have revolved around Anthropic\u2019s refusal to allow the federal government to <a href=\"https:\/\/www.theguardian.com\/us-news\/2026\/feb\/24\/anthropic-claude-military-ai\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">use its Claude AI<\/a> for domestic mass surveillance or autonomous weapons systems, but the dispute also reflects the messy nature of what happens when tech companies have their products integrated into conflict. 
The Pentagon this week declared Anthropic a supply chain risk for its refusal to agree to the government\u2019s terms, while Anthropic has vowed to challenge the designation in court.<\/p>\n<p class=\"dcr-130mj7b\">The Guardian spoke with Sarah Kreps, a professor and director of the Tech Policy Institute at Cornell University who previously served in the United States air force, about how the feud has played out.<\/p>\n<p class=\"dcr-130mj7b\">You\u2019ve worked for a while on problems around \u201cdual-use technology\u201d. What happens when there\u2019s a consumer technology that also gets used for classified or military purposes?<\/p>\n<p class=\"dcr-130mj7b\">I\u2019ve thought about this a lot because I was in the military and I was on the side of the military that was developing and acquiring new technologies. We were always getting criticism about why it was taking so long, and now watching what\u2019s happening I realize why it takes so long.<\/p>\n<p class=\"dcr-130mj7b\">What you would develop for classified and military contexts is very different from what Anthropic has developed for when I use Claude. The challenge for the military is that these technologies are so useful they can\u2019t wait until a military-grade version is available. They need to act quickly because of how valuable these tools are, but it\u2019s not surprising that they ran into cultural differences between not just an AI platform and the military, but an AI platform that has tried to cultivate a reputation as being more safety-conscious.<\/p>\n<p class=\"dcr-130mj7b\">One element in this feud is that Anthropic has branded itself as a safety-forward company, but then it did sign onto a deal with the military. <\/p>\n<p class=\"dcr-130mj7b\">Yes, there is a way in which it\u2019s surprising that Anthropic would be surprised by where this ended up. 
Part of the challenge is that Anthropic seems to have made the decision a year or two ago that <a href=\"https:\/\/www.theguardian.com\/technology\/chatgpt\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT<\/a> was going to be for individual users and Anthropic was going to try to corner the enterprise market. That means they\u2019re trying to do business with organizations, rather than trying to sell individual plans.<\/p>\n<p class=\"dcr-130mj7b\">The puzzle to me is that they were then doing business with the Pentagon and Palantir, which is in the business of using AI for what some people would say are questionable purposes. So that decision was surprising to me because it was very much at odds with the brand that Anthropic was trying to curate.<\/p>\n<p class=\"dcr-130mj7b\">It seems like Anthropic was OK with a pretty wide use of its technology, but that there was a red line they reached with domestic mass surveillance and lethal autonomous weapons.<\/p>\n<p class=\"dcr-130mj7b\">There are a couple of possibilities. One is that some of this had to do with relationships between the people in Anthropic and the <a href=\"https:\/\/www.theguardian.com\/us-news\/trump-administration\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Trump administration<\/a>, which led to a downward spiral of distrust.<\/p>\n<p class=\"dcr-130mj7b\">Second, there was the situation in <a href=\"https:\/\/www.theguardian.com\/world\/venezuela\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Venezuela<\/a> and then the politics around ICE activities. There is this question of what it actually means to be using these technologies lawfully. 
One person\u2019s definition of lawful might look very different from another\u2019s.<\/p>\n<p class=\"dcr-130mj7b\">The Pentagon\u2019s argument was, in part, that if there\u2019s a national defense issue we shouldn\u2019t have to call up Dario Amodei to get approval. It does seem like there is an actual question here around what role private tech companies have in national security decision-making.<\/p>\n<p class=\"dcr-130mj7b\">If you recall the case of the <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/feb\/17\/inside-the-fbis-encryption-battle-with-apple\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">San Bernardino killer\u2019s iPhone<\/a>, authorities were worried that this was a ticking bomb situation and they needed Apple to get into the phone. [In 2016, the FBI <a href=\"https:\/\/www.theguardian.com\/us-news\/2016\/feb\/17\/apple-ordered-to-hack-iphone-of-san-bernardino-shooter-for-fbi\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">demanded<\/a> Apple create a backdoor to grant them access to a mass shooter\u2019s phone. Apple <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/feb\/17\/apple-challenges-chilling-demand-decrypt-san-bernadino-iphone\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">refused<\/a> on privacy grounds, resulting in the FBI seeking out an independent third party to hack into the device].<\/p>\n<p class=\"dcr-130mj7b\">The difference here with Anthropic\u2019s AI is that once you hand this over to the military, you no longer need Anthropic\u2019s approval to use it as you see fit. It\u2019s the difference between hardware and software. You can repurpose this software and use it in ways that maybe weren\u2019t part of the explicit agreement, but now you can justify it on the basis of national security. 
Then Anthropic has lost all its leverage because it\u2019s in the hands of these national security professionals.<\/p>\n<p class=\"dcr-130mj7b\">And Anthropic wouldn\u2019t be able to tell what it\u2019s even being used for, correct?<\/p>\n<p class=\"dcr-130mj7b\">Yeah, exactly right. It goes into not just a black box, but black ops and classified systems that are closed off.<\/p>\n<p class=\"dcr-130mj7b\">I\u2019ve found it interesting this week that it seems like a lot of really longstanding questions on AI use in the military are coming to a head. You\u2019ve been following these issues for a long time. What are you thinking about watching this current fight?<\/p>\n<p class=\"dcr-130mj7b\">When I would hear the CEO of Anthropic talk, he would talk about these existential risks and the misappropriation of AI for bioterrorism. I always thought that those were either too distant or too out of reach. I thought this sort of more mundane case was more of a risk.<\/p>\n<p class=\"dcr-130mj7b\">There have also been people for a long time foreshadowing these questions about autonomous weapons. The challenge is: how do you ever know whether there\u2019s actually a human in the loop? This was a concern that Anthropic had \u2013 how do we know if these systems are being used in a fully autonomous way? The US says we are not going to use AI in a fully autonomous capacity, but it\u2019s not clear what that process looks like for ensuring that doesn\u2019t happen. This was a long time coming, but I guess it was sort of inevitable that we would go in that direction, just because the technology has gotten more and more sophisticated. The fact of now being involved in a conflict just accelerates those timelines.<\/p>\n<p class=\"dcr-130mj7b\">We talk a lot about threats from AI and these red lines that people backed away from, but how is AI already being used in warfare?<\/p>\n<p class=\"dcr-130mj7b\">You can see how it\u2019s extremely useful in a military setting. 
I did some work on the intel side and one of the challenges is not the lack of content, it\u2019s the signal-to-noise ratio. You have a huge volume of information but it can be really hard to connect the dots, and that\u2019s something that AI is so good at. You feed it large amounts of information and it generates outputs that help identify what the signal is.<\/p>\n<p class=\"dcr-130mj7b\">If you\u2019re looking for pattern recognition, AI is really good at pattern recognition. You can identify the correlates or characteristics you\u2019re looking for and then it can go out and identify things, say an Iranian naval vessel, based on what you\u2019ve programmed it to identify. That\u2019s not been super controversial in some ways, because those targets are fairly concrete.<\/p>\n<p class=\"dcr-130mj7b\">Where people get more uncomfortable is in a setting where the US, for example, would do counter-terrorism strikes. You have an individual on the ground who doesn\u2019t have a lot of identifiable characteristics, and so that is a much more precarious situation for AI where you\u2019d really want to make sure you\u2019re triple-checking. He could be a combatant, he could be a civilian. 
It\u2019s not a naval vessel or surface-to-air missile, where it\u2019s harder to get that wrong.<\/p>\n","protected":false},"excerpt":{"rendered":"Anthropic\u2019s ongoing fight with the Department of Defense over what safety restrictions it can put on its artificial&hellip;\n","protected":false},"author":2,"featured_media":463422,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-463421","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/463421","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=463421"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/463421\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/463422"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=463421"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=463421"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=463421"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}