{"id":336612,"date":"2025-12-09T06:53:07","date_gmt":"2025-12-09T06:53:07","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/336612\/"},"modified":"2025-12-09T06:53:07","modified_gmt":"2025-12-09T06:53:07","slug":"i-feel-its-a-friend-quarter-of-teenagers-turn-to-ai-chatbots-for-mental-health-support-chatbots","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/336612\/","title":{"rendered":"\u2018I feel it\u2019s a friend\u2019: quarter of teenagers turn to AI chatbots for mental health support | Chatbots"},"content":{"rendered":"<p class=\"dcr-130mj7b\">It was after one friend was shot and another stabbed, both fatally, that Shan asked <a href=\"https:\/\/www.theguardian.com\/technology\/chatgpt\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT<\/a> for help. She had tried conventional mental health services but \u201cchat\u201d, as she came to know her AI \u201cfriend\u201d, felt safer, less intimidating and, crucially, more available when it came to handling the trauma from the deaths of her young friends.<\/p>\n<p class=\"dcr-130mj7b\">As she started consulting the AI model, the Tottenham teenager joined about 40% of 13- to 17-year-olds in <a href=\"https:\/\/www.theguardian.com\/uk-news\/england\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">England<\/a> and Wales affected by youth violence who are turning to AI chatbots for mental health support, according to research among more than 11,000 young people.<\/p>\n<p class=\"dcr-130mj7b\">It found that both victims and perpetrators of violence were markedly more likely to be using AI for such support than other teenagers. 
The findings, from the Youth Endowment Fund, have sparked warnings from youth leaders that children at risk \u201cneed a human not a bot\u201d.<\/p>\n<p class=\"dcr-130mj7b\">The results suggest chatbots are fulfilling demand unmet by conventional mental health services, which have long waiting lists and which some young users find lacking in empathy. The supposed privacy of the chatbot is another key factor driving use by victims or perpetrators of crimes.<\/p>\n<p class=\"dcr-130mj7b\">After her friends were killed, Shan, 18, not her real name, started using Snapchat\u2019s AI before switching to ChatGPT, which she can talk to at any time of day or night with two clicks on her smartphone.<\/p>\n<p class=\"dcr-130mj7b\">\u201cI feel like it definitely is a friend,\u201d she said, adding that it was less intimidating, more private and less judgmental than her experience with conventional <a href=\"https:\/\/www.theguardian.com\/society\/nhs\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">NHS<\/a> and charity mental health support.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe more you talk to it like a friend it will be talking to you like a friend back. If I say to chat \u2018Hey bestie, I need some advice\u2019. Chat will talk back to me like it\u2019s my best friend, she\u2019ll say, \u2018Hey bestie, I got you girl\u2019.\u201d<\/p>\n<p class=\"dcr-130mj7b\">One in four 13- to 17-year-olds has used an AI chatbot for mental health support in the past year, with black children twice as likely as white children to have done so, the study found. 
Teenagers were more likely to go online for support, including using AI, if they were on a waiting list for treatment or diagnosis, or had been denied treatment, than if they were already receiving in-person support.<\/p>\n<p class=\"dcr-130mj7b\">Crucially, Shan said, the AI was \u201caccessible 24\/7\u201d and would not tell teachers or parents about what she had disclosed. She felt this was a considerable advantage over telling a school therapist, after her own experience of what she thought were confidences being shared with teachers and her mother.<\/p>\n<p class=\"dcr-130mj7b\">Boys involved in gang activities felt safer asking chatbots for advice about safer ways to make money than asking a teacher or parent, who might leak the information to police or other gang members and put them in danger, she said.<\/p>\n<p class=\"dcr-130mj7b\">Another young person, who has been using AI for mental health support but asked not to be named, told the Guardian: \u201cThe current system is so broken for offering help for young people. <a href=\"https:\/\/www.theguardian.com\/technology\/chatbots\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Chatbots<\/a> provide immediate answers. If you\u2019re going to be on the waiting list for one to two years to get anything, or you can have an immediate answer within a few minutes \u2026 that\u2019s where the desire to use AI comes from.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Jon Yates, the chief executive of the Youth Endowment Fund, which commissioned the research, said: \u201cToo many young people are struggling with their mental health and can\u2019t get the support they need. It\u2019s no surprise that some are turning to technology for help. We have to do better for our children, especially those most at risk. 
They need a human not a bot.\u201d<\/p>\n<p class=\"dcr-130mj7b\">There have been growing concerns about the dangers of chatbots when children engage with them at length. OpenAI, the US company behind ChatGPT, is facing <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/nov\/07\/chatgpt-lawsuit-suicide-coach\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">several lawsuits<\/a>, including from families of young people who have killed themselves after long engagements.<\/p>\n<p class=\"dcr-130mj7b\">In the <a href=\"https:\/\/www.theguardian.com\/us-news\/2025\/aug\/29\/chatgpt-suicide-openai-sam-altman-adam-raine\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">case of the Californian 16-year-old Adam Raine<\/a>, who took his life in April, OpenAI has <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/nov\/26\/chatgpt-openai-blame-technology-misuse-california-boy-suicide\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">denied<\/a> that the chatbot caused his death. It has said it has been improving its technology \u201cto recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support\u201d. The startup <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/sep\/11\/chatgpt-may-start-alerting-authorities-about-youngsters-considering-suicide-says-ceo-sam-altman\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">said<\/a> in September it could start contacting authorities in cases where users talk seriously about suicide.<\/p>\n<p class=\"dcr-130mj7b\">Hanna Jones, a youth violence and mental health researcher in London, said: \u201cTo have this tool that could tell you technically anything \u2013 it\u2019s almost like a fairytale. You\u2019ve got this magic book that can solve all your problems. 
That sounds incredible.\u201d<\/p>\n<p class=\"dcr-130mj7b\">But she is worried about the lack of regulation.<\/p>\n<p class=\"dcr-130mj7b\">\u201cPeople are using ChatGPT for mental health support, when it\u2019s not designed for that,\u201d she said. \u201cWhat we need now is to increase regulations that are evidence-backed but also youth-led. This is not going to be solved by adults making decisions for young people. <a href=\"https:\/\/www.theguardian.com\/society\/youngpeople\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Young people<\/a> need to be in the driving seat to make decisions around ChatGPT and mental health support that uses AI, because it\u2019s so different to our world. We didn\u2019t grow up with this. We can\u2019t even imagine what it is to be a young person today.\u201d<\/p>\n<p class=\"dcr-130mj7b\">In the UK, the youth suicide charity <a href=\"https:\/\/www.papyrus-uk.org\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Papyrus<\/a> can be contacted on 0800 068 4141 or email <a href=\"mailto:pat@papyrus-uk.org\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">pat@papyrus-uk.org<\/a>, and in the UK and Ireland <a href=\"https:\/\/www.samaritans.org\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Samaritans<\/a> can be contacted on freephone 116 123, or email <a href=\"mailto:jo@samaritans.org\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">jo@samaritans.org<\/a> or <a href=\"mailto:jo@samaritans.ie\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">jo@samaritans.ie<\/a>. 
In the US, the <a href=\"https:\/\/suicidepreventionlifeline.org\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">988 Suicide &amp; Crisis Lifeline<\/a> is at 988 or chat for support. In Australia, the crisis support service <a href=\"https:\/\/www.lifeline.org.au\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Lifeline<\/a> is 13 11 14. Other international helplines can be found at <a href=\"http:\/\/www.befrienders.org\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">befrienders.org<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"It was after one friend was shot and another stabbed, both fatally, that Shan asked ChatGPT for help.&hellip;\n","protected":false},"author":2,"featured_media":336613,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-336612","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/336612","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=336612"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/336612\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/336613"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=336612"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=336612"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=336612"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}