{"id":280256,"date":"2025-11-24T07:51:09","date_gmt":"2025-11-24T07:51:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/280256\/"},"modified":"2025-11-24T07:51:09","modified_gmt":"2025-11-24T07:51:09","slug":"ai-could-be-changing-our-brains-in-ways-we-dont-even-realise","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/280256\/","title":{"rendered":"AI could be changing our brains in ways we don\u2019t even realise"},"content":{"rendered":"<p>Everyone is cheating. Earlier this year, research showed that almost every student was relying on <a href=\"https:\/\/www.independent.co.uk\/topic\/ai\" rel=\"nofollow noopener\" target=\"_blank\">AI<\/a> tools such as <a href=\"https:\/\/www.independent.co.uk\/topic\/chatgpt\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT<\/a> for their work: 88 per cent of students polled had used it for assignments, up from 53 per cent last year. (The numbers are largely similar in the US.) Anecdotally, the vast number of people already using AI in their work becomes its own kind of justification: if everyone else is cheating, why wouldn\u2019t you? <\/p>\n<p>Because it is making you think less, and less well, the research shows, though it is still limited. Earlier this year, <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/www.media.mit.edu\/projects\/your-brain-on-chatgpt\/overview\/\">researchers divided participants<\/a> into three groups and asked them to write an essay. Some were given help from a large language model (LLM), such as ChatGPT; some were allowed access to Google; some didn\u2019t have any help at all. They then studied the three groups in a variety of ways.<\/p>\n<p>As they wrote, their brains worked differently. The more help people got, the less active parts of their brains were. Those who had been given help by AI were less good at quoting their essays. 
Researchers cautioned that the work is early and relatively limited \u2013 and explicitly warned against using it to suggest that people were being made more stupid \u2013 but it at the very least raised \u201cthe pressing matter of exploring a possible decrease in learning skills\u201d from using large language models in education.<\/p>\n<p>For thousands of years, thinkers have worried that technology could undermine memory and understanding. The first of those technologies was writing itself. In Plato\u2019s Phaedrus, Socrates warns that the written word could weaken memory, and that text might only make people seem to have knowledge rather than actually having it.<\/p>\n<p>Computers have only made those concerns more pressing. In a paper in 2011, researchers identified the \u201cGoogle effect\u201d, in which having information readily available at our fingertips seemed to make it less available inside our heads. Even 15 years ago, researchers found that people asked to recall facts were primed to think of computers, and that expecting information to remain readily available made them less likely to actually remember it. \u201cThe internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves,\u201d they wrote.<\/p>\n<p>One big fear about the impact of AI on education is that it doesn\u2019t feel as if it is making us stupid: using it feels like learning. In an article published in the summer, information systems researcher <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/theconversation.com\/is-chatgpt-making-us-stupid-255370\">Aaron French noted<\/a> that talking to AI \u201ccan artificially inflate one\u2019s perceived intelligence while actually reducing cognitive effort\u201d. 
He pointed to the Dunning-Kruger effect \u2013 which suggests that a little knowledge is a dangerous thing, because you feel empowered with information but don\u2019t yet have enough of it to be aware of what you don\u2019t know \u2013 and warned that using AI wrongly can leave people stuck at that dangerous spike of confidence, what researchers have called the \u201cpeak of Mount Stupid\u201d.<\/p>\n<p>Late last month, Anastasia Berg \u2013 who teaches philosophy at the University of California, Irvine \u2013 <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/www.nytimes.com\/2025\/10\/29\/opinion\/ai-students-thinking-school-reading.html\">noted that<\/a> many see a divide between \u201cillicit uses of AI\u201d, such as having it write a whole essay, and \u201cinnocent auxiliary functions\u201d, such as helping with the outline of that essay. But, she argued, deciding what to write about is itself an indispensable skill. \u201cNo aspect of cognitive understanding is perfunctory,\u201d she wrote.<\/p>\n<p>Still, AI is arriving in universities, whether those running them like it or not. Earlier this year, Oxford became one of a number of universities to strike an official deal with OpenAI, the creator of ChatGPT, after what it said was a \u201csuccessful year-long pilot\u201d. Students get access to a special version of ChatGPT that protects data and includes other safeguards; OpenAI gets to suggest that AI is becoming more central to learning.<\/p>\n<p>Much of the discussion around Oxford&#8217;s embrace of AI explicitly acknowledged that its students had embraced it already: the choice isn&#8217;t between essays being written with ChatGPT or not, but whether the university officially recognises it. 
&#8220;We know that significant numbers of staff and students are already using generative AI tools,&#8221; noted Anne Trefethen, the University of Oxford\u2019s pro-vice-chancellor for digital, when the project was announced. The use of AI has taken on its own momentum, and many academics suggest that it is better to teach students to use it well than to teach them without it.<\/p>\n<p>&#8220;University-wide access to ChatGPT Edu will support the development of rigorous academic skills and digital literacy, so that we prepare our graduates to thrive and lead by example in an AI-enabled world,&#8221; said Freya Johnston, pro-vice-chancellor for education at Oxford University. \u201cGenerative AI is also helping us to explore new ways of engaging with students, alongside our renowned face-to-face teaching and tutorial model, which emphasises critical thinking and contextual analysis.&#8221;<\/p>\n<p>Oxford&#8217;s own rules don&#8217;t forbid generative AI in research, but require that users &#8220;remain ultimately responsible for GenAI content used in research&#8221;. They tell researchers to keep &#8220;an awareness of the tools\u2019 limitations, such as hallucinations, or social biases that may be embedded in training data, which could perpetuate misrepresentation of social categories, protected groups, or historical inaccuracies&#8221;, as well as requiring them to be aware of other dangers and to be transparent about their use of the tools.<\/p>\n<p>Many universities have similar rules. 
Earlier this year, New York Magazine \u2013 in a piece headlined \u201ceveryone is cheating their way through college\u201d, which claimed that the technology has \u201cunravelled the entire academic project\u201d \u2013 <a rel=\"nofollow noopener\" target=\"_blank\" href=\"https:\/\/nymag.com\/intelligencer\/article\/openai-chatgpt-ai-cheating-education-college-students-school.html\">reported on a student<\/a> who happily flouted Columbia University\u2019s rules against using AI without permission. Columbia too has a tie-up with OpenAI, it noted.<\/p>\n<p>In that world, students might have to learn differently \u2013 and that might include learning how to relate to artificial intelligence. Kaitlyn Regehr, an associate professor in digital humanities at University College London, has warned that the growth of artificial intelligence should bring with it a specific kind of education about \u201chow much of our thinking, or more specifically the development of our thinking, is acceptable to outsource\u201d. \u201cWhat is the responsibility to shift and to supplement through our education system, throughout parenting, in order to support young people?\u201d she asked at an event earlier this year.<\/p>\n<p>That could mean a project similar to PE classes in schools. \u201cWith the advent of the car, and more sedentary vocations, a boom in research around physical health was born,\u201d she said. \u201cAnd because we were not moving, because the technology did that for us, we needed to start to artificially move.<\/p>\n<p>\u201cWe saw gym culture emerge, and PE class. Because people weren&#8217;t moving, because technology was moving for us. I think a really helpful analogy I hope for parents [&#8230;] is a gym for the AI age. A social, emotional gym. A social, emotional PE class. 
What do we now need to supplement, if AI is increasingly doing things for us, and children are not having to move their minds?\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Everyone is cheating. Earlier this year, research showed that almost every student was relying on AI tools such&hellip;\n","protected":false},"author":2,"featured_media":280257,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-280256","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/280256","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=280256"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/280256\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/280257"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=280256"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=280256"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=280256"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","template
d":true}]}}