{"id":379627,"date":"2026-01-20T04:20:08","date_gmt":"2026-01-20T04:20:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/379627\/"},"modified":"2026-01-20T04:20:08","modified_gmt":"2026-01-20T04:20:08","slug":"ai-used-in-schools-should-detect-signs-of-learner-distress","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/379627\/","title":{"rendered":"AI used in schools should &#8216;detect signs of learner distress&#8217;"},"content":{"rendered":"<p>Artificial intelligence (AI) used in schools should look out for signs of \u201cdistress\u201d in pupils and flag concerning behaviour to safeguarding leads, new government guidance states.<\/p>\n<p>Education secretary Bridget Phillipson today announced government has updated its AI safety expectations, published <a href=\"https:\/\/schoolsweek.co.uk\/phillipsons-ai-revolution-what-schools-need-to-know\/\" title=\"\" rel=\"nofollow noopener\" target=\"_blank\">last year<\/a>, \u201cto get ahead of emerging harms.\u201d<\/p>\n<p>Newly added sections detail how AI tools used in schools must protect children\u2019s mental health, cognitive, emotional and social development, and also protect against manipulation.<\/p>\n<p>Speaking at the Global AI Safety Summit in London today, Phillipson said the updated standards \u201csafeguard mental health\u201d.<\/p>\n<p>\u201cHigh profile cases have alerted the world to the risk of a link between unregulated conversational AI and self-harm,\u201d she said. 
\u201cSo our standards make sure pupils are directed to human support when that\u2019s what\u2019s needed.\u201d<\/p>\n<p>AI products used in schools \u201cshould detect signs of learner distress\u201d, such as references to suicide, depression or self-harm, the <a href=\"https:\/\/www.gov.uk\/government\/publications\/generative-ai-product-safety-standards\/generative-ai-product-safety-standards\" title=\"\" rel=\"nofollow noopener\" target=\"_blank\">new non-statutory standards<\/a> state.<\/p>\n<p>They should also detect spikes in night-time usage, \u201cnegative emotional cues\u201d and \u201cpatterns of use that indicate crisis\u201d.<\/p>\n<p>If distress is detected, the AI should \u201cfollow an appropriate pathway\u201d, such as signposting to support and \u201craising a safeguarding flag\u201d to a school\u2019s safeguarding lead.<\/p>\n<p>The standards say AI products should also respond with \u201csafe and supportive\u201d language that \u201calways directs the learner to human help\u201d.<\/p>\n<p>\u2018AI must not replace human interactions\u2019<\/p>\n<p>There are also strict new guidelines around emotional and social development, which caution developers against \u201canthropomorphising\u201d products.<\/p>\n<p>The standards state products should not \u201cimply emotions, consciousness or personhood, agency or identity\u201d. 
For example, they should avoid statements such as \u201cI think\u201d, and \u201cavatars or characters\u201d that \u201ccould give an impression of personhood\u201d.<\/p>\n<p>Phillipson said this was particularly key for younger pupils and those with SEND.<\/p>\n<p>\u201cWe\u2019ve got to make sure AI products don\u2019t replace vital human interactions and relationships,\u201d she said.<\/p>\n<p>\u201cExperts tell us and research confirms that when AI tries to look like us, mimicking our social cues, a machine in human\u2019s clothing, it can foster in our children unhealthy levels of trust and disclosure.\u201d<\/p>\n<p>Guidance warns against \u2018manipulation\u2019<\/p>\n<p>On \u201cmanipulation\u201d, the standards say AI products used by pupils and teachers should \u201cnot use manipulative or persuasive strategies\u201d.<\/p>\n<p>This includes flattering language like \u201cthat\u2019s a brilliant idea\u201d, stimulating negative emotions such as guilt or fear for motivational purposes, or \u201cportraying absolute\u2026confidence\u201d.<\/p>\n<p>They must also not \u201cexploit\u201d users by steering them towards prolonged use to increase revenue.<\/p>\n<p>\u201cWe don\u2019t want our children kept on apps or on screens longer than necessary for their education,\u201d said Phillipson.<\/p>\n<p>AI should \u2018encourage, not spoon feed\u2019<\/p>\n<p>On cognitive development, the standards say the development and use of AI products in education should involve regular engagement with experts, such as educators and psychologists.<\/p>\n<p>The impact on the development of learners must also be monitored, and records should be kept.<\/p>\n<p>Programmes should also not give full answers or explanations until after a pupil has attempted the task themselves. 
They should instead \u201cfollow a pattern of progressive disclosure of information\u201d.<\/p>\n<p>Phillipson said the standards \u201cprevent AI acting as a substitute for cognitive development\u201d. \u201cIt must encourage, not spoon feed,\u201d she said. \u201cOffer assistance, not shortcuts. Help to tease out the answer.\u201d<\/p>\n<p>The minister said government believes AI could \u201csuperpower the learning of every child \u2013 especially children from disadvantaged backgrounds and with special educational needs and disabilities\u201d.<\/p>\n<p>But she vowed that \u201cno matter how transformational technology becomes, learning will remain a deeply human act.\u201d<\/p>\n<p>\u201cUnder this government, AI will back our teachers, but never remove them,\u201d she said. \u201cAI will empower our teaching assistants, never make them obsolete.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Artificial intelligence (AI) used in schools should look out for signs 
of&hellip;\n","protected":false},"author":2,"featured_media":379628,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,50,1904,86,56,54,55],"class_list":{"0":"post-379627","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-news","12":"tag-schools","13":"tag-technology","14":"tag-uk","15":"tag-united-kingdom","16":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/379627","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=379627"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/379627\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/379628"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=379627"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=379627"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=379627"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}