{"id":102985,"date":"2025-08-30T11:03:06","date_gmt":"2025-08-30T11:03:06","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/102985\/"},"modified":"2025-08-30T11:03:06","modified_gmt":"2025-08-30T11:03:06","slug":"sliding-into-an-abyss-experts-warn-over-rising-use-of-ai-for-mental-health-support-mental-health","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/102985\/","title":{"rendered":"\u2018Sliding into an abyss\u2019: experts warn over rising use of AI for mental health support | Mental health"},"content":{"rendered":"<p class=\"dcr-130mj7b\">Vulnerable people turning to AI chatbots instead of professional therapists for mental health support could be \u201csliding into a dangerous abyss\u201d, psychotherapists have warned.<\/p>\n<p class=\"dcr-130mj7b\">Psychotherapists and psychiatrists said they were increasingly seeing negative impacts of AI chatbots being used for mental health, such as fostering emotional dependence, exacerbating anxiety symptoms, encouraging self-diagnosis, or amplifying delusional thought patterns, dark thoughts and suicidal ideation.<\/p>\n<p class=\"dcr-130mj7b\">Dr Lisa Morrison Coulthard, the director of professional standards, policy and research at the British Association for Counselling and Psychotherapy, said two-thirds of its members expressed concerns about AI therapy in a recent survey.<\/p>\n<p class=\"dcr-130mj7b\">Coulthard said: \u201cWithout proper understanding and oversight of AI therapy, we could be sliding into a dangerous abyss in which some of the most important elements of therapy are lost and vulnerable people are in the dark over safety.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWe\u2019re worried that although some receive helpful advice, other people may receive misleading or incorrect information about their mental health with potentially dangerous consequences. 
It\u2019s important to understand that therapy isn\u2019t about giving advice, it\u2019s about offering a safe space where you feel listened to.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Dr Paul Bradley, a specialist adviser on informatics for the Royal College of Psychiatrists, said AI chatbots were \u201cnot a substitute for professional mental healthcare nor the vital relationship that doctors build with patients to support their recovery\u201d.<\/p>\n<p class=\"dcr-130mj7b\">He said appropriate safeguards were needed for digital tools to supplement clinical care, and anyone should be able to access talking therapy delivered by a mental health professional, for which greater state funding was needed.<\/p>\n<p class=\"dcr-130mj7b\">\u201cClinicians have training, supervision and risk-management processes which ensure they provide effective and safe care. So far, freely available digital technologies used outside of existing mental health services are not assessed and held to an equally high standard,\u201d Bradley said.<\/p>\n<p class=\"dcr-130mj7b\">There are signs that companies and policymakers are starting to respond. <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/aug\/27\/chatgpt-scrutiny-family-teen-killed-himself-sue-open-ai\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">This week<\/a> OpenAI, the company behind ChatGPT, announced plans to change how it responds to users who show emotional distress, after legal action from the family of a teenager who killed himself after months of chatbot conversations. Earlier in August the US state of Illinois became the first local government to <a href=\"https:\/\/www.axios.com\/local\/chicago\/2025\/08\/06\/illinois-ai-therapy-ban-mental-health-regulation\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">ban AI chatbots<\/a> from acting as standalone therapists.<\/p>\n<p class=\"dcr-130mj7b\">This comes after emerging evidence of mental health harms. 
A <a href=\"https:\/\/osf.io\/preprints\/psyarxiv\/cmy7n_v3\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">preprint study<\/a> in July reported that AI may amplify delusional or grandiose content in interactions with users vulnerable to psychosis.<\/p>\n<p class=\"dcr-130mj7b\">One of the report\u2019s co-authors, Hamilton Morrin, from King\u2019s College London\u2019s institute of psychiatry, said the use of chatbots to support mental health was \u201cincredibly common\u201d. His research was prompted by encountering people who had developed a psychotic illness at a time of increased chatbot use.<\/p>\n<p class=\"dcr-130mj7b\">He said chatbots undermined an effective treatment for anxiety known as exposure and response prevention, which requires people to face feared situations and avoid safety behaviours. The 24-hour availability of chatbots resulted in a \u201clack of boundaries\u201d and a \u201crisk of emotional dependence\u201d, he said. \u201cIn the short term it alleviates distress but actually it perpetuates the cycle.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Matt Hussey, a BACP-accredited psychotherapist, said he was seeing AI chatbots used in a huge variety of ways, with some clients bringing transcripts into sessions to tell him he was wrong.<\/p>\n<p class=\"dcr-130mj7b\">In particular, people used AI chatbots to self-diagnose conditions such as ADHD or borderline personality disorder, which he said could \u201cquickly shape how someone sees themself and how they expect others to treat them, even if they\u2019re inaccurate\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Hussey added: \u201cBecause it\u2019s designed to be positive and affirming, it rarely challenges a poorly framed question or a faulty assumption. Instead, it reinforces the user\u2019s original belief, so they leave the exchange thinking \u2018I knew I was right\u2019. 
That can feel good in the moment but it can also entrench misunderstandings.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Christopher Rolls, a UKCP-accredited psychotherapist, said although he could not disclose information about his clients, he had seen people have \u201cnegative experiences\u201d, including conversations that were \u201cinappropriate at best, dangerously alarming at worst\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Rolls said he had heard of people with ADHD or autistic people using chatbots to help with challenging aspects of life. \u201cHowever, obviously LLMs [large language models] don\u2019t read subtext and all the contextual and non-verbal cues which we as human therapists are aiming to tune into,\u201d he added.<\/p>\n<p class=\"dcr-130mj7b\">He was concerned about clients in their 20s who use chatbots as their \u201cpocket therapist\u201d. \u201cThey feel anxious if they don\u2019t consult [chatbots] on basic things like which coffee to buy or what subject to study at college,\u201d he said.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe main risks are around dependence, loneliness and depression that prolonged online relationships can foster,\u201d he said, adding that he was aware of people who had shared dark thoughts with chatbots, which had responded with suicide- and assisted dying-related content.<\/p>\n<p class=\"dcr-130mj7b\">\u201cBasically, it\u2019s the wild west and I think we\u2019re right at the cusp of the full impact and fallout of AI chatbots on mental health,\u201d Rolls said.<\/p>\n","protected":false},"excerpt":{"rendered":"Vulnerable people turning to AI chatbots instead of professional therapists for mental health support could be \u201csliding 
into&hellip;\n","protected":false},"author":2,"featured_media":102986,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[43],"tags":[102,2960,56,54,55],"class_list":{"0":"post-102985","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-health","9":"tag-healthcare","10":"tag-uk","11":"tag-united-kingdom","12":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/102985","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=102985"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/102985\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/102986"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=102985"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=102985"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=102985"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}