{"id":392542,"date":"2026-01-27T05:26:08","date_gmt":"2026-01-27T05:26:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/392542\/"},"modified":"2026-01-27T05:26:08","modified_gmt":"2026-01-27T05:26:08","slug":"humanity-needs-to-wake-up-to-dangers-of-ai-says-anthropic-chief","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/392542\/","title":{"rendered":"\u2018Humanity needs to wake up\u2019 to dangers of AI, says Anthropic chief"},"content":{"rendered":"<p>Humanity \u201cneeds to wake up\u201d to the potentially catastrophic risks posed by powerful AI systems in the years to come, according to Anthropic boss Dario Amodei, whose company is among those pushing the frontiers of the technology.<\/p>\n<p>In a nearly 20,000-word essay, posted on Monday, Amodei sketched out the risks that could emerge if the technology develops unchecked \u2014 ranging from large-scale job losses to bioterrorism.<\/p>\n<p>\u201cHumanity is about to be handed almost unimaginable power and it is deeply unclear whether our social, political and technological systems possess the maturity to wield it,\u201d Amodei wrote.<\/p>\n<p>The essay marked a stark warning from one of the most powerful entrepreneurs in the AI industry that safeguards around AI are inadequate.<\/p>\n<p>Amodei outlines the risks that could arise with the advent of what he calls \u201cpowerful AI\u201d \u2014 systems that would be \u201cmuch more capable than any Nobel Prize winner, statesman or technologist\u201d \u2014 which he predicts is likely in the next \u201cfew years\u201d.<\/p>\n<p>Among those risks is the potential for individuals to develop biological weapons capable of killing millions or \u201cin the worst case even destroying all life on 
Earth\u201d.<\/p>\n<p>\u201cA disturbed loner [who] can perpetrate a school shooting, but probably can\u2019t build a nuclear weapon or release a plague\u2009.\u2009.\u2009. will now be elevated to the capability level of the PhD virologist,\u201d wrote Amodei. <\/p>\n<p>He also raises the potential of AI to \u201cgo rogue and overpower humanity\u201d or to empower authoritarians and other bad actors, leading to \u201ca global totalitarian dictatorship\u201d. <\/p>\n<p>Amodei, whose company <a href=\"https:\/\/www.ft.com\/stream\/15c0cb45-8892-46cd-a086-1d2716ae7246\" title=\"\" data-trackable=\"link\" rel=\"nofollow noopener\" target=\"_blank\">Anthropic<\/a> is the chief rival to ChatGPT maker OpenAI, has clashed with David Sacks, President Donald Trump\u2019s AI and crypto \u201ctsar\u201d, over the direction of US regulation. <\/p>\n<p>He has also likened the administration\u2019s plans to sell advanced AI chips to China to selling nuclear weapons to North Korea. <\/p>\n<p>Trump signed an executive order last month to hamper state-level efforts to regulate AI companies, and published an AI action plan last year laying out plans to accelerate US innovation. <\/p>\n<p>In <a href=\"https:\/\/www.darioamodei.com\/essay\/the-adolescence-of-technology#4-player-piano\" title=\"\" data-trackable=\"link\" rel=\"nofollow noopener\" target=\"_blank\">the essay<\/a>, Amodei warned of sweeping job losses and a \u201cconcentration of economic power\u201d and wealth in Silicon Valley as a result of AI. <\/p>\n<p>\u201cThis is the trap: AI is so powerful, such a glittering prize, that it is very difficult for human civilisation to impose any restraints on it at all,\u201d he added. 
<\/p>\n<p>In a veiled reference to the <a href=\"https:\/\/www.ft.com\/content\/f5ed0160-7098-4e63-88e5-8b3f70499b02\" title=\"\" data-trackable=\"link\" rel=\"nofollow noopener\" target=\"_blank\">controversy<\/a> around Elon Musk\u2019s Grok AI, Amodei wrote that \u201csome AI companies have shown a disturbing negligence towards the sexualisation of children in today\u2019s models, which makes me doubt that they\u2019ll show either the inclination or the ability to address autonomy risks in future models\u201d.<\/p>\n<p>AI safety concerns such as bioweapons, autonomous weapons and malicious state actors featured prominently in public discourse in 2023, partly driven by warnings from leaders such as Amodei. <\/p>\n<p>That year, the UK government organised an AI safety summit in Bletchley Park, where countries and labs agreed to work together to counter such risks. A successor meeting is due to be held in India in February. <\/p>\n<p>But political decisions around AI are increasingly being driven by a desire to seize the opportunities presented by the new technology rather than mitigate its risks, according to Amodei. 
<\/p>\n<p>\u201cThis vacillation is unfortunate, as the technology itself doesn\u2019t care about what is fashionable, and we are considerably closer to real danger in 2026 than we were in 2023,\u201d he wrote.<\/p>\n<p>Amodei was an early employee at OpenAI but left to co-found Anthropic in 2020 after clashing with Sam Altman over OpenAI\u2019s direction and AI guardrails.<\/p>\n<p>Anthropic is in talks with groups including Microsoft and Nvidia and investors including Singaporean sovereign wealth fund GIC, Coatue and Sequoia Capital about a funding round of $25bn or more, valuing the company at $350bn. 
<\/p>\n","protected":false},"excerpt":{"rendered":"Stay informed with free updates Simply sign up to the Artificial intelligence myFT Digest &#8212; delivered directly to&hellip;\n","protected":false},"author":2,"featured_media":392543,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-392542","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/392542","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=392542"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/392542\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/392543"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=392542"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=392542"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=392542"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}