{"id":476650,"date":"2026-03-15T08:54:07","date_gmt":"2026-03-15T08:54:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/476650\/"},"modified":"2026-03-15T08:54:07","modified_gmt":"2026-03-15T08:54:07","slug":"one-word-could-change-how-you-think-of-chatgpt","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/476650\/","title":{"rendered":"One word could change how you think of ChatGPT"},"content":{"rendered":"<p>Recently, I\u2019ve been saying something terrifying about <a href=\"https:\/\/www.independent.co.uk\/topic\/claude\" rel=\"nofollow noopener\" target=\"_blank\">Claude<\/a>: \u201che\u201d. <a href=\"https:\/\/www.independent.co.uk\/topic\/anthropic\" rel=\"nofollow noopener\" target=\"_blank\">Anthropic<\/a>\u2019s <a href=\"https:\/\/www.independent.co.uk\/topic\/ai\" rel=\"nofollow noopener\" target=\"_blank\">AI<\/a> encourages you to do it, in the obvious way that he \u2013 sorry, it \u2013 takes a human name, but in <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/www.independent.co.uk\/tech\/ethical-ai-chatgpt-claude-anthropic-b2899636.html\">the much deeper way that it seems to have something of a personality<\/a>, in a way that the more dull and obsequious competitors such as ChatGPT don\u2019t. It dares you, as you talk to it, to anthropomorphise it.<\/p>\n<p>I probably shouldn\u2019t be so hard on myself; it\u2019s difficult not to think of these things as intelligent. Never in human history have we met anything that could talk to us like this, since words have been essentially human for as long as we\u2019ve known. But the reminder that Claude is not human is an important if terrifying one. The \u201cit\u201d is crucial. Because \u2013 and this is an obvious truth, but one we should keep repeating to ourselves \u2013 Claude is a computer.<\/p>\n<p>I don\u2019t mean this to be cruel to Claude. For nearly 100 years, computers have been better than humans at a whole range of tasks. 
Being a computer is not a slur, and very often is the opposite. But it gets to an important fact about these chatbots: they compute, they don\u2019t understand. They are like calculators for words. Calculators have enabled us to work things out that would have previously been beyond our comprehension. But they don\u2019t understand maths, they just do it.<\/p>\n<p>As soon as you start calling it a computer \u2013 and I have done so insistently in recent weeks \u2013 it seems to change the shape of everything in just one word. It is like an inverted version of switching from \u201che\u201d to \u201cit\u201d. \u201cI was speaking to my AI and he told me that I should quit my job\u201d is a perfectly sensible-feeling sentence; \u201cthe computer told me to quit my job\u201d might even have some wisdom in it, but it is of an entirely different and more accurate kind.<\/p>\n<p>It\u2019s the second letter of AI that is the really damaging one. \u201cIntelligence\u201d is a particularly tricksy piece of marketing. In one sense, that is of course what these chatbots are: if we understand intelligence as a kind of informational handiness, then that\u2019s exactly what they are. But intelligence implies some sort of mental event, and so we are moved into the wrong sort of understanding. Intelligence is a useful metaphor, but sometimes the metaphor gets confused with the actual thing, and you can\u2019t see the forest for the trees.<\/p>\n<p>We used to be more careful with the words we used for these things. For years, AI experts were uncomfortable with calling what they made \u201cartificial intelligence\u201d, preferring the phrase \u201cmachine learning\u201d, since it more accurately and usefully describes the actual process, as well as avoiding both the technical imprecision and philosophical baggage that comes with talking about AI. 
But after ChatGPT was released, the victory of those two letters quickly became complete, and now you sound a little boring if you refuse to use it. But being interesting was the insidious point of that marketing exercise: OpenAI wants you to use the exciting words, because they want you to be excited about buying into it.<\/p>\n<p>But we should use the boring ones. Because chatbots are very big, very good computers. Very, very good computers. Only this week, I was using Claude to analyse my marathon training plan \u2013 I gave it my Strava data, going back to the first ever marathon I did in 2019, and it produced charts that compared where I was in my training at comparable points in the past, computing it in ways that I didn\u2019t think possible. But then it strained against its limits, giving me something more like emotive cheerleading, telling me that my goal was well within my grasp as long as I tried hard enough. <\/p>\n<p>A computer doesn\u2019t know anything like that. (Interestingly, the Sonnet 4.6 model that Claude was recently updated to use is remarkably self-aware about these limitations, and sometimes refuses to give me this kind of human motivation even when I ask; curiously, this humility somehow makes me more likely to understand it as something that should be called he.)<\/p>\n<p>Calling it a computer does highlight these kinds of limitations. Even though AI systems are generative, that ostensibly creative work is itself a kind of computing: all it does is guess the right words in the right order. There might be wisdom there, but it is our wisdom. The supercomputers that churn away to work out pi to unimaginable numbers of digits don\u2019t have any sense of what they are working out; when we look at them with a kind of wonder, that wonder is ours, just as the instructions that began their work were ours. 
Calling it a computer shifts that responsibility back onto us, in a way that can be scary but is absolutely central to responsibly using these systems.<\/p>\n<p>This is ethically important. <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/clicks.independent.co.uk\/f\/a\/LArsoKYz7RzZB1rUgXeJiQ~~\/AAAHahA~\/hb9-IbFY2WpsJWS9-LI28KpqDon98G43eWyXOvYPZVRilAZPm3ymx_mUoMTc1CnxxqsQSDPp2MKhqB1iDaLT7BV7CLvSG8RRooiv4fo3Lx6G9wyKyjDB24_p3iByM-QdbGfUbyS04He9ojyx9MH7K5__O1mLrIUZXME1cI6vpov2pNITexymQtBeBaWyJOo6DHHAwr1O2s6i9Cctyk5TZonneLqdDbpKWm3XC1Q7dEvhGK5UoypTPJPNF37t4UBhUSDrk87FXFTw3oXWF6qdKASJVHNloMbP7XBzj7XY6t58A5oZv6dcvk-2QK2WIICuHZvnHBZSmYZLgJ7OpnzZdBmszwTdSU-61F6VvOFTPK-DG0lZ1tbOnysWnDoJrdGZNRHrwm-z2HWeJ_gH3TfUXWfMC6e_EOl9H6ctu-k9c5DQ4ET5q9VBsDSWHQPXxMQDzDogxOxJ99PdhvI0mJohRQ~~\">AI systems are now being used to kill people<\/a>. There can be a quasi-religious tendency to talk about this as if it represents some new force in the world, some vast unknowable and powerful intelligence that we can\u2019t truly reckon with. But when we call it a computer, we put the responsibility back into our hands: some human started that horrible, fatal calculation, and some other human chose to use it. To allow ourselves not to call it a computer is letting people \u2013 real, human people \u2013 get away with that.<\/p>\n<p>So calling it a computer can feel a little silly, but it is really done with the most honest and anguished intentions. Calling it a computer might be the most morally serious and truthful thing we can do.<\/p>\n","protected":false},"excerpt":{"rendered":"Recently, I\u2019ve been saying something terrifying about Claude: \u201che\u201d. 
Anthropic\u2019s AI encourages you to do it, in the&hellip;\n","protected":false},"author":2,"featured_media":476651,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-476650","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/476650","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=476650"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/476650\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/476651"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=476650"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=476650"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=476650"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}