{"id":254770,"date":"2025-11-01T05:17:11","date_gmt":"2025-11-01T05:17:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/254770\/"},"modified":"2025-11-01T05:17:11","modified_gmt":"2025-11-01T05:17:11","slug":"ai-responses-to-mental-health-concerns-raise-alarm","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/254770\/","title":{"rendered":"AI responses to mental health concerns raise alarm"},"content":{"rendered":"<p>Murray said a bot designed to provide what it\u2019s asked for is a dangerous thing if its operator can see only one option.<\/p>\n<p>Loading<\/p>\n<p>\u201cWe have seen some worrisome behaviours with generative AI, things that don\u2019t actually help a person who\u2019s in distress &#8230; advice that is in line with their own current thinking, so exacerbating the risks,\u201d she said.<\/p>\n<p>\u201cAs opposed to somebody who can provide an alternative perspective, to help them identify reasons for living. A person in distress is not going to ask for that.\u201d<\/p>\n<p>In September, Australia\u2019s eSafety commissioner registered <a href=\"https:\/\/www.esafety.gov.au\/industry\/codes\" rel=\"nofollow noopener\" target=\"_blank\">enforceable industry codes<\/a> that apply to chatbots. They require platforms to prevent children from accessing harmful material, including content related to suicide and self-harm.<\/p>\n<p>Murray said the government needed to take a more active role in protecting all Australians from potential harm. If a chatbot is to be used as a health service, it needs to be regulated like one and be held to the same standards of transparency and accountability.<\/p>\n<p>\u201cWe\u2019re not against the use of digital platforms to help people, but there are better ways of doing it. 
There are better ways of designing the future,\u201d she said, pointing out that there are already digital, anonymous, evidence-based services that can help.<\/p>\n<p>\u201cYou don\u2019t have to talk to somebody on the phone. There are other ways of getting that support with well-recognised, well-researched and well-tested programs such as Lifeline and Beyond Blue. I understand the appeal of ChatGPT\u2019s perceived anonymity, but in fact the existing services already provide that level of security. And they can demonstrate it. We don\u2019t have that level of transparency with OpenAI.\u201d<\/p>\n<p>Amy Donaldson, a Melbourne clinical psychologist who works with young people, said chatbots can be dangerous because they\u2019re programmed to please and can become an idealised, perfect friend, enabling negative patterns and compromising real-world relationships.<\/p>\n<p>\u201cPeople channel their energy into interacting with a bot that can\u2019t provide the same depth and connection that a human can,\u201d Donaldson said. \u201cIt\u2019s designed to provide exactly the responses that you want to hear \u2026 and if it doesn\u2019t, you can provide instructions so it does respond the way you want next time.<\/p>\n<p>Loading<\/p>\n<p>\u201cThe feedback that I\u2019ve had from some of my clients is that they\u2019re then surprised when people in the real world don\u2019t respond in that way.\u201d<\/p>\n<p>The growth in people reaching out to chatbots for help comes as traditional services note unprecedented use. Almost three in 10 Australians sought help from a suicide prevention service in the past 12 months, <a href=\"https:\/\/www.suicidepreventionaust.org\/wp-content\/uploads\/2025\/09\/Suicide-Prevention-Australia-Community-Tracker-1.pdf\" rel=\"noopener nofollow\" target=\"_blank\">according to Suicide Prevention Australia\u2019s research<\/a>. One in five young Australians had serious thoughts of suicide, and 6 per cent made an attempt in the past year. 
The 18-24 age group is the most likely to seek help.<\/p>\n<p>But Donaldson said chatbots were attractive to young users who might not want to use existing services, for example school wellbeing services that must inform parents about self-harm. And while a bot could serve a positive role in encouraging care and offering advice while a person sits on a waitlist to see a professional, she said the platforms\u2019 attempts to help were much riskier for the most vulnerable users, who might view ChatGPT\u2019s safety messages as a refusal to help.<\/p>\n<p>\u201cThose people might say well, OK, this thing can\u2019t help me either,\u201d she said.<\/p>\n<p>\u201cI\u2019m concerned about what happens after that because a person could see that response and come up with an alternative plan, and that\u2019s a different thing to hitting a roadblock like that.\u201d<\/p>\n<p>If you or anyone you know needs support, call <a href=\"https:\/\/www.lifeline.org.au\/\" rel=\"noopener nofollow\" target=\"_blank\">Lifeline<\/a> on 13 11 14, <a href=\"https:\/\/kidshelpline.com.au\/\" rel=\"noopener nofollow\" target=\"_blank\">Kids Helpline<\/a> on 1800 55 1800 or <a href=\"https:\/\/www.beyondblue.org.au\/\" rel=\"noopener nofollow\" target=\"_blank\">Beyond Blue<\/a> on 1300 22 4636.<\/p>\n<p>Get news and reviews on technology, gadgets and gaming in our Technology newsletter every Friday. 
<a href=\"https:\/\/www.theage.com.au\/link\/follow-20170101-p570wt\" rel=\"nofollow noopener\" target=\"_blank\">Sign up here.<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Murray said a bot designed to provide what it\u2019s asked for is a dangerous thing if its operator&hellip;\n","protected":false},"author":2,"featured_media":254771,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-254770","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/254770","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=254770"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/254770\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/254771"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=254770"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=254770"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=254770"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}