{"id":496086,"date":"2026-02-24T10:22:08","date_gmt":"2026-02-24T10:22:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/496086\/"},"modified":"2026-02-24T10:22:08","modified_gmt":"2026-02-24T10:22:08","slug":"sam-altman-is-losing-his-grip-on-humanity","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/496086\/","title":{"rendered":"Sam Altman Is Losing His Grip on Humanity"},"content":{"rendered":"<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">Last Friday, onstage at a major AI summit in India, Sam Altman wanted to address what he called an \u201cunfair\u201d criticism. The OpenAI CEO was asked by a reporter from The Indian Express about the natural resources required to train and run generative-AI models. Altman immediately pushed back. Chatbots do require a lot of power, yes, but have you thought about all of the resources demanded by human beings across our evolutionary history?<\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">\u201cIt also takes a lot of energy to train a human,\u201d Altman <a data-event-element=\"inline link\" href=\"https:\/\/www.youtube.com\/live\/qH7thwrCluM?si=pcTetpDzekghNhti&amp;t=1662\" rel=\"nofollow noopener\" target=\"_blank\">told<\/a> a packed pavilion. \u201cIt takes, like, 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took, like, the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science and whatever to produce you, and then you took whatever, you know, you took.\u201d<\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">He continued: \u201cThe fair comparison is, if you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question, versus a human? 
And probably, AI has already caught up on an energy-efficiency basis, measured that way.\u201d<\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">Altman\u2019s comments are easy to pick apart. The energy used by the <a data-event-element=\"inline link\" href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC10629395\/\" rel=\"nofollow noopener\" target=\"_blank\">brain<\/a> is significantly less than what even efficient frontier models require for simple queries, not to mention the laptops and smartphones people use to prompt AI models. It is true that people have to consume actual sustenance before they \u201cget smart,\u201d though this is also a helpful bit of redirection on Altman\u2019s part\u2014the real concern with AI is not so much the resources it demands as the amount it contributes to climate change. Atmospheric carbon dioxide is at levels not seen in <a data-event-element=\"inline link\" href=\"https:\/\/news.climate.columbia.edu\/2023\/12\/07\/a-new-66-million-year-history-of-carbon-dioxide-offers-little-comfort-for-today\/\" rel=\"nofollow noopener\" target=\"_blank\">millions of years<\/a>\u2014a rise driven not by the evolution of the 117 billion people and all of the other creatures that have ever existed, but by contemporary human society and combustion turbines akin to those OpenAI is setting up at its Stargate data centers. 
Other data centers, too, are building private, gas-fired power plants\u2014which collectively will likely be capable of generating enough <a data-event-element=\"inline link\" href=\"https:\/\/cleanview.co\/content\/power-strategies-report\" rel=\"nofollow noopener\" target=\"_blank\">electricity<\/a> for, and emitting as much greenhouse gas as, dozens of major American cities\u2014or <a data-event-element=\"inline link\" href=\"https:\/\/www.eesi.org\/articles\/view\/data-center-buildout-is-hungry-for-fossil-fuels\" rel=\"nofollow noopener\" target=\"_blank\">extending the life<\/a> of coal plants. (OpenAI, which has a corporate partnership with the business side of this magazine, did not respond to a request for comment when I reached out to ask about Altman\u2019s remarks.)<\/p>\n<p id=\"injected-recirculation-link-0\" class=\"ArticleRelatedContentLink_root__VYc9V\" data-view-action=\"view link - injected link - item 1\" data-event-element=\"injected link\" data-event-position=\"1\"><a href=\"https:\/\/www.theatlantic.com\/technology\/archive\/2024\/07\/how-much-data-ai-use\/678908\/\" rel=\"nofollow noopener\" target=\"_blank\">Read: Every time you post to Instagram, you\u2019re turning on a lightbulb forever<\/a><\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">But what\u2019s really significant about Altman\u2019s words is that he thought to compare chatbots to humans at all. Doing so suggests that he views people and machines on equal terms. He didn\u2019t fumble his words; this is a common, calculated position within the AI industry. Altman made an almost identical <a data-event-element=\"inline link\" href=\"https:\/\/www.forbesindia.com\/article\/ai-tracker\/ai-is-already-far-more-energy-efficient-than-humans-at-inference-sam-altman\/2991578\/1\" rel=\"nofollow noopener\" target=\"_blank\">statement<\/a> to Forbes India at the same AI summit. 
And a week ago, Dario Amodei\u2014the CEO of Anthropic, and Altman\u2019s chief rival\u2014made a similar analogy, <a data-event-element=\"inline link\" href=\"https:\/\/www.dwarkesh.com\/p\/dario-amodei-2\" rel=\"nofollow noopener\" target=\"_blank\">likening<\/a> the training of AI models to human evolution and day-to-day learning. The mindset trickles down to product development. Anthropic is studying whether its chatbot, Claude, is <a data-event-element=\"inline link\" href=\"https:\/\/www.anthropic.com\/research\/exploring-model-welfare\" rel=\"nofollow noopener\" target=\"_blank\">conscious<\/a> or can feel \u201cdistress,\u201d and allows Claude to <a data-event-element=\"inline link\" href=\"https:\/\/www.anthropic.com\/research\/end-subset-conversations\" rel=\"nofollow noopener\" target=\"_blank\">cut off<\/a> \u201cpersistently harmful or abusive\u201d conversations in which there are \u201crisks to model welfare\u201d\u2014explicitly anthropomorphizing a program that does not eat, drink, or have any will of its own.<\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">AI firms are convinced either that their products really are comparable to humans or that this is good marketing. Both options are alarming. A genuine belief that they are building a higher power, perhaps even a god\u2014Altman, in the same appearance, said that he thinks <a data-event-element=\"inline link\" href=\"https:\/\/www.theatlantic.com\/technology\/2026\/02\/do-you-feel-agi-yet\/685845\/\" rel=\"nofollow noopener\" target=\"_blank\">superintelligence is just a few years away<\/a>\u2014might easily justify treating humans and the planet as collateral damage. 
Altman also said, in his response to concerns about energy consumption, that the problem is real because \u201cthe world is now using so much AI\u201d\u2014and so societies must \u201cmove towards nuclear, or wind and solar, very quickly.\u201d Another option would be for the AI industry to wait.<\/p>\n<p id=\"injected-recirculation-link-1\" class=\"ArticleRelatedContentLink_root__VYc9V\" data-view-action=\"view link - injected link - item 2\" data-event-element=\"injected link\" data-event-position=\"2\"><a href=\"https:\/\/www.theatlantic.com\/technology\/2026\/02\/do-you-feel-agi-yet\/685845\/\" rel=\"nofollow noopener\" target=\"_blank\">Read: Do you feel the AGI yet?<\/a><\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">If Altman\u2019s comparison of chatbots and people is purely a PR tactic, it is a deeply misanthropic one. He is speaking to investors. The notion that AI labs are building digital life has always been convenient to their mythology, of course, and OpenAI is <a data-event-element=\"inline link\" href=\"https:\/\/www.reuters.com\/technology\/openai-sees-compute-spend-around-600-billion-by-2030-cnbc-reports-2026-02-20\/\" rel=\"nofollow noopener\" target=\"_blank\">reportedly<\/a> in the middle of a fundraising round that would value the company at more than $800 billion\u2014nearly as much as Walmart.<\/p>\n<p class=\"ArticleParagraph_root__4mszW\" data-flatplan-paragraph=\"true\">Tech companies may genuinely want to develop AI tools for the benefit of all humanity, to echo OpenAI\u2019s <a data-event-element=\"inline link\" href=\"https:\/\/openai.com\/index\/introducing-openai\/\" rel=\"nofollow noopener\" target=\"_blank\">founding mission<\/a>, and genuinely believe that they need to raise vast amounts of cash to do so. 
But to liken raising a child\u2014or, for that matter, the evolution of Homo sapiens\u2014to developing algorithmic products makes very clear that the industry has lost touch, if it ever had any, with <a data-event-element=\"inline link\" href=\"https:\/\/www.theatlantic.com\/magazine\/archive\/2023\/07\/generative-ai-human-culture-philosophy\/674165\/\" rel=\"nofollow noopener\" target=\"_blank\">what it means to be human<\/a>. To \u201ctrain a human\u201d\u2014that is, to live a life\u2014is to struggle, to accept the possibility of failure, and to sometimes meander simply in search of wonder and beauty. Generative AI is all about cutting out that process and making any pursuit as instant, efficient, and effortless as possible. These tools may serve us. But to put them on the same plane as organic life is sad.<\/p>\n","protected":false},"excerpt":{"rendered":"Last Friday, onstage at a major AI summit in India, Sam Altman wanted to address what he called&hellip;\n","protected":false},"author":2,"featured_media":496087,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[62,276,277,49,48,61],"class_list":{"0":"post-496086","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-ca","12":"tag-canada","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/496086","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=496086"}],"version-
history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/496086\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/496087"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=496086"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=496086"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=496086"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}