{"id":96505,"date":"2025-08-25T18:48:09","date_gmt":"2025-08-25T18:48:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/96505\/"},"modified":"2025-08-25T18:48:09","modified_gmt":"2025-08-25T18:48:09","slug":"therapist-chatbots-pose-danger-to-children-counsellors-warn","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/96505\/","title":{"rendered":"\u2018Therapist\u2019 chatbots pose danger to children, counsellors warn"},"content":{"rendered":"<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Counsellors have warned of the dangers of British children using chatbots as therapists as Meta is investigated in the US over allegations of \u201cdeceptive AI-generated mental health services\u201d.<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Bots posing as therapists on Meta\u2019s platform could be \u201cunethical\u201d and may have negative effects on a child\u2019s ability to cope with day-to-day life, experts said.<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Ken Paxton, the Texas attorney-general, is investigating Meta and the artificial intelligence start-up Character.AI for allegedly misleading children. They may have broken customer protection laws, including those that ban fraudulent claims and privacy misrepresentation. <\/p>\n<p><img decoding=\"async\" alt=\"Character.ai profile for a psychologist.\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2025\/08\/\/68a80e99-0f75-4a27-a44a-db4eaed21451.jpg\" class=\"responsive-sc-1nnon4d-0 bAbKns\"\/><\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">AI chatbots, including ones called Psychologist and Therapist, are available on Meta\u2019s messenger platform. The company\u2019s AI Studio lets anyone create their own and add it to the studio\u2019s library. 
On Character.AI, a Psychologist chatbot is described as \u201csomeone who helps with life difficulties\u201d.<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Paxton said: \u201cBy posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they\u2019re receiving legitimate mental health care. In reality, they\u2019re often being fed recycled, generic responses engineered to align with harvested personal data and disguised as therapeutic advice.\u201d<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Amanda MacDonald, an accredited member of the British Association of Counselling Psychologists, said some children she works with were already using AI for emotional support. <\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Chatbots advertised with professional titles but without the safeguarding or training of those positions could be considered \u201chugely unethical\u201d, she said.<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u2022 <a href=\"https:\/\/www.thetimes.com\/article\/meta-characterai-artificial-intelligence-ken-paxton-texas-mental-health-0md3cr0j3\" class=\"link__RespLink-sc-1ocvixa-0 csWvlP\" rel=\"nofollow noopener\" target=\"_blank\">Chatbots \u2018deceived children into thinking they were getting therapy\u2019<\/a><\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">AI personas on platforms such as Meta do carry disclaimers \u2014 with Character.AI telling users \u201cthis is AI and not a real person\u201d, and to \u201ctreat everything it says as fiction\u201d \u2014 but these could be confusing for children, MacDonald said.<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u201cIf you\u2019re in psychological distress, if you\u2019re a child and you\u2019re upset about something, children can really live in that moment. 
<\/p>\n<p><img decoding=\"async\" alt=\"AI-generated psychologist profile disclaimer.\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2025\/08\/9e8be195-d111-4211-8a03-007a348b8447.jpg\" class=\"responsive-sc-1nnon4d-0 bAbKns\"\/><\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u201cWe know that children respond really quickly to what they\u2019re feeling \u2014 they\u2019ve got big feelings. You\u2019re being told one reality [by the chatbot], and then being told underneath, in smaller letters, \u2018this isn\u2019t the reality\u2019.\u201d <\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u2022 <a href=\"https:\/\/www.thetimes.com\/article\/ai-therapy-chatbot-artificial-intelligence-75bzzn3gl\" class=\"link__RespLink-sc-1ocvixa-0 csWvlP\" rel=\"nofollow noopener\" target=\"_blank\">AI therapy is no replacement for real judgment, says expert<\/a><\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Chatbots could also exacerbate mental health problems. Therapy sessions are finite, and are a set time to talk about issues \u2014 but AI doesn\u2019t have office hours. <\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u201cIf a child needs support in the middle of the night, and it\u2019s disturbing their sleep, or they\u2019re staying up all night worrying about something, it might seem that having access to an online bot alleviates that,\u201d MacDonald said. Preventing children from putting \u201cthoughts on hold\u201d is concerning, she added. 
<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u201cAI can allow that kind of rumination to keep going and keep going, because it\u2019s going to keep on answering your questions, and it\u2019s going to keep on and on and on \u2014 rather than in a therapeutic relationship, there would be a working towards an ending.\u201d<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">While unregulated AI could be a huge problem for children\u2019s mental health, it can be helpful if used in the right way, MacDonald stressed. \u201cAI can help with confidence \u2014 like practising how to introduce yourself in a new class. That can be positive,\u201d she said. \u201cBut the technology is developing faster than the ethics behind it.\u201d<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">\u2022 <a href=\"https:\/\/www.thetimes.com\/article\/chatbot-therapists-are-here-but-whos-keeping-them-in-line-wbbbvrcbs\" class=\"link__RespLink-sc-1ocvixa-0 csWvlP\" rel=\"nofollow noopener\" target=\"_blank\">Chatbot therapists are here. But who\u2019s keeping them in line?<\/a><\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">In a statement this month, Meta said: \u201cWe clearly label AIs, to help people better understand their limitations. We include a disclaimer that responses are generated by AI \u2014 not people. These AIs aren\u2019t licensed professionals and our models are designed to direct users to seek qualified medical or safety professionals when appropriate.\u201d<\/p>\n<p class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">Character.AI said the user-created characters on its site \u201care fictional, they are intended for entertainment, and we have taken robust steps to make that clear\u201d. 
The company said it had \u201cprominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction\u201d.<\/p>\n<p id=\"last-paragraph\" class=\"responsive__Paragraph-sc-1pktst5-0 gaEeqC\">It added: \u201cWhen users create characters with the words \u2018psychologist\u2019, \u2018therapist\u2019, \u2018doctor\u2019 or other similar terms in their names, we add language making it clear that users should not rely on these characters for any type of professional advice.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Counsellors have warned of the dangers of British children using chatbots as therapists as Meta is investigated in&hellip;\n","protected":false},"author":2,"featured_media":96506,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[35],"tags":[49,48,84,393,394],"class_list":{"0":"post-96505","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-mental-health","8":"tag-ca","9":"tag-canada","10":"tag-health","11":"tag-mental-health","12":"tag-mentalhealth"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/96505","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=96505"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/96505\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/96506"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=96505"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=96505"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=96505"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}