{"id":394880,"date":"2026-01-06T07:47:07","date_gmt":"2026-01-06T07:47:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/394880\/"},"modified":"2026-01-06T07:47:07","modified_gmt":"2026-01-06T07:47:07","slug":"leading-ai-expert-delays-timeline-for-its-possible-destruction-of-humanity-ai-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/394880\/","title":{"rendered":"Leading AI expert delays timeline for its possible destruction of humanity | AI (artificial intelligence)"},"content":{"rendered":"<p class=\"dcr-130mj7b\">A leading artificial intelligence expert has <a href=\"https:\/\/www.lesswrong.com\/posts\/YABG5JmztGGPwNFq2\/ai-futures-timelines-and-takeoff-model-dec-2025-update\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">rolled<\/a> <a href=\"https:\/\/x.com\/DKokotajlo\/status\/1991564542103662729\" data-link-name=\"in body link\" rel=\"nofollow\">back<\/a> his timeline for AI doom, saying it will take longer than he initially predicted for AI systems to be able to code autonomously and thus speed their own development toward superintelligence.<\/p>\n<p class=\"dcr-130mj7b\">Daniel Kokotajlo, a former employee of OpenAI, sparked an energetic debate in April by releasing <a href=\"https:\/\/ai-2027.com\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">AI 2027<\/a>, a scenario that envisions unchecked AI development leading to the creation of a superintelligence, which \u2013 after outfoxing world leaders \u2013 destroys humanity.<\/p>\n<p class=\"dcr-130mj7b\">The scenario rapidly won <a href=\"https:\/\/intelligence.org\/2025\/04\/09\/thoughts-on-ai-2027\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">admirers<\/a> and detractors. 
The US vice-president, JD Vance, appeared to <a href=\"https:\/\/www.nytimes.com\/2025\/05\/21\/opinion\/jd-vance-pope-trump-immigration.html\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">reference<\/a> AI 2027 in an interview last May when discussing the US\u2019s artificial intelligence arms race with China. Gary Marcus, an emeritus professor of neuroscience at New York University, <a href=\"https:\/\/garymarcus.substack.com\/p\/the-ai-2027-scenario-how-realistic\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">called<\/a> the piece a \u201cwork of fiction\u201d and some of its conclusions \u201cpure science fiction mumbo jumbo\u201d.<\/p>\n<p>\u2018Our timelines \u2026 are a bit longer still,\u2019 Daniel Kokotajlo wrote. Photograph: Twitter\/X<\/p>\n<p class=\"dcr-130mj7b\">Timelines for transformative artificial intelligence \u2013 sometimes called AGI (artificial general intelligence), or AI capable of replacing humans at most cognitive tasks \u2013 have become a fixture in communities devoted to AI safety. The release of ChatGPT in 2022 vastly accelerated these timelines, with <a href=\"https:\/\/www.dwarkesh.com\/p\/paul-christiano\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">officials<\/a> and <a href=\"https:\/\/yoshuabengio.org\/2023\/06\/24\/faq-on-catastrophic-ai-risks\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">experts<\/a> predicting the arrival of AGI within decades or years.<\/p>\n<p class=\"dcr-130mj7b\">Kokotajlo and his team named 2027 as the year AI would achieve \u201cfully autonomous coding\u201d, although they said that this was a \u201cmost likely\u201d guess and some among them had longer timelines. 
Now, some doubts appear to be surfacing about the imminence of AGI, and whether the term is meaningful in the first place.<\/p>\n<p class=\"dcr-130mj7b\">\u201cA lot of other people have been pushing their timelines further out in the past year, as they realise how jagged AI performance is,\u201d said Malcolm Murray, an AI risk management expert and one of the authors of the International AI Safety Report.<\/p>\n<p class=\"dcr-130mj7b\">\u201cFor a scenario like AI 2027 to happen, [AI] would need a lot more practical skills that are useful in real-world complexities. I think people are starting to realise the enormous inertia in the real world that will delay complete societal change.\u201d<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe term AGI made sense from far away, when AI systems were very narrow \u2013 playing chess, and playing Go,\u201d said Henry Papadatos, the executive director of the French AI nonprofit SaferAI. \u201cNow we have systems that are quite general already and the term does not mean as much.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Kokotajlo\u2019s AI 2027 relies on the idea that AI agents will fully automate coding and AI R&amp;D by 2027, leading to an \u201cintelligence explosion\u201d in which AI agents create smarter and smarter versions of themselves, and then \u2013 in one possible ending \u2013 kill all humans by mid-2030 in order to make room for more solar panels and datacentres.<\/p>\n<p class=\"dcr-130mj7b\">However, in their update, Kokotajlo and his co-authors revise their expectations for when AI might be able to code autonomously, putting this as likely to happen in the early 2030s, as opposed to 2027. The new forecast sets 2034 as the horizon for \u201csuperintelligence\u201d and does not contain a guess for when AI may destroy humanity.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThings seem to be going somewhat slower than the AI 2027 scenario. 
Our timelines were longer than 2027 when we published and now they are a bit longer still,\u201d <a href=\"https:\/\/x.com\/DKokotajlo\/status\/1991564542103662729\" data-link-name=\"in body link\" rel=\"nofollow\">wrote<\/a> Kokotajlo in a post on X.<\/p>\n<p class=\"dcr-130mj7b\">Creating AIs that can do AI research is still firmly an aim of leading AI companies. The OpenAI CEO, Sam Altman, <a href=\"https:\/\/x.com\/sama\/status\/1983584366547829073\" data-link-name=\"in body link\" rel=\"nofollow\">said<\/a> in October that having an automated AI researcher by March 2028 was an \u201cinternal goal\u201d of his company, but added: \u201cWe may totally fail at this goal.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Andrea Castagna, a Brussels-based AI policy researcher, said there were a number of complexities that dramatic AGI timelines do not address. \u201cThe fact that you have a superintelligent computer focused on military activity doesn\u2019t mean you can integrate it into the strategic documents we have compiled for the last 20 years.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe more we develop AI, the more we see that the world is not science fiction. 
The world is a lot more complicated than that.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"A leading artificial intelligence expert has rolled back his timeline for AI doom, saying it will take longer&hellip;\n","protected":false},"author":2,"featured_media":394881,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-394880","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/394880","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=394880"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/394880\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/394881"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=394880"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=394880"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=394880"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}