{"id":93579,"date":"2025-08-25T01:54:08","date_gmt":"2025-08-25T01:54:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/93579\/"},"modified":"2025-08-25T01:54:08","modified_gmt":"2025-08-25T01:54:08","slug":"can-you-replace-your-therapist-with-an-ai-chatbot","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/93579\/","title":{"rendered":"Can you replace your therapist with an AI chatbot? \u00a0"},"content":{"rendered":"\n<p>An increasing number of people are using artificial intelligence (AI) chatbots such as ChatGPT for mental health and support. There are both benefits and risks associated with the use of AI for mental health. However, there are no policies or guidelines to inform the safe use of AI chatbots for mental health purposes.<\/p>\n<p>Over <a href=\"https:\/\/www.aihw.gov.au\/mental-health\/overview\/prevalence-and-impact-of-mental-illness\" rel=\"nofollow noopener\" target=\"_blank\">4.3\u00a0million people<\/a> (or 1 in 5 Australians) experienced a mental health condition in the previous 12\u00a0months. Access to mental health care is hindered by high costs, long wait times, and a shortage of trained specialists.<\/p>\n<p>An increasing number of people are turning to generative artificial intelligence (AI) tools such as chatbots for mental health support, advice and companionship. Examples of general-purpose AI chatbots \u2014 which were not designed for mental health purposes but are increasingly being used in this way \u2014 include ChatGPT, Claude, and Replika. While health care practitioners are aware of \u201cDoctor Google\u201d, they may not be aware that their patients are turning to AI chatbots for mental health information, clinical advice, and real-time support.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"683\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/08\/shutterstock_2274546309-1024x683.jpg\" alt=\"Can you replace your therapist with an AI chatbot? 
\u00a0 - Featured Image\" class=\"wp-image-79113\"  \/>General-purpose AI chatbots, such as ChatGPT, were not designed to provide mental health advice and support (Diego Thomazini \/ Shutterstock).<\/p>\n<p>The potential benefits of AI chatbots<\/p>\n<p>Unlike purpose-built AI chatbots for mental health (eg, <a href=\"https:\/\/mhealth.jmir.org\/2018\/11\/e12106\" rel=\"nofollow noopener\" target=\"_blank\">Wysa<\/a>, <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S2214782923000374\" rel=\"nofollow noopener\" target=\"_blank\">Woebot<\/a>, <a href=\"https:\/\/ai.nejm.org\/doi\/full\/10.1056\/AIoa2400802\" rel=\"nofollow noopener\" target=\"_blank\">Therabot<\/a>), which draw on evidence-based psychological therapies and include in-built safeguards, general-purpose AI chatbots, such as ChatGPT, were not designed to provide mental health advice and support.<\/p>\n<p>However, general-purpose AI chatbots that are being <a href=\"https:\/\/mental.jmir.org\/2024\/1\/e60589\" rel=\"nofollow noopener\" target=\"_blank\">used for mental health support in Australia<\/a> may have advantages. They are easily accessible and provide free, immediate, 24\/7 access to information and mental health support. These tools can help individuals learn about mental health conditions, identify new coping strategies, build self-awareness, manage stress, and discover fresh perspectives on challenging situations. Because of their conversational tone and ability to simulate empathetic and supportive responses, end-users report feeling <a href=\"https:\/\/www.nature.com\/articles\/s44184-024-00097-4\" rel=\"nofollow noopener\" target=\"_blank\">heard and understood<\/a>, and people may find themselves asking AI chatbots questions that they feel too embarrassed to ask a health professional.<\/p>\n<p>While there are benefits, there are also real, serious, and potentially devastating risks for people who use unregulated AI chatbots for mental health support. 
Chatbots are <a href=\"https:\/\/www.scientificamerican.com\/article\/chatbot-hallucinations-inevitable\/\" rel=\"nofollow noopener\" target=\"_blank\">known to \u201challucinate\u201d<\/a>, and may provide information that is incorrect, misleading, or <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/40195448\/\" rel=\"nofollow noopener\" target=\"_blank\">biased<\/a>. This information may sound plausible, but it is generated without contextual understanding, relevant detail, or fact checking.<\/p>\n<p>Without appropriate oversight from a trained professional, a person in distress may be unable to critically evaluate the information a chatbot provides. This is particularly true when the information sounds convincing and reinforces what a person wants to hear.<\/p>\n<p>Potential risks associated with AI chatbots<\/p>\n<p>We do not know how many people have been exposed to harmful advice from chatbots because adverse events are not routinely monitored or reported. It should be noted that even AI tools designed for mental health purposes may have the potential to cause harm. For example, the chatbot \u201cTessa\u201d was removed by the US National Eating Disorders Association (NEDA) after it was found to <a href=\"https:\/\/www.independent.co.uk\/tech\/ai-eating-disorder-harmful-advice-b2349499.html\" rel=\"nofollow noopener\" target=\"_blank\">give \u201charmful advice\u201d to people with eating disorders<\/a>. General-purpose chatbots such as ChatGPT are widely used for multiple purposes and tend to be <a href=\"https:\/\/arstechnica.com\/ai\/2025\/06\/ai-chatbots-tell-users-what-they-want-to-hear-and-thats-problematic\/\" rel=\"nofollow noopener\" target=\"_blank\">enthusiastic, effusive and agreeable<\/a>. They generally provide answers without critique (unless prompted). There is a high risk that if someone experiencing delusional beliefs turns to an AI chatbot, they may receive information that reinforces rather than challenges their views. 
Likewise, an AI chatbot may <a href=\"https:\/\/dl.acm.org\/doi\/full\/10.1145\/3706598.3713429\" rel=\"nofollow noopener\" target=\"_blank\">endorse harmful behaviours related to suicidality or drug use<\/a>. This is because AI chatbots rely on the input a user provides, but do not have the ability to gauge whether this information is reliable or true.<\/p>\n<p>What has been termed <a href=\"https:\/\/www.psychologytoday.com\/us\/blog\/dancing-with-the-devil\/202506\/how-emotional-manipulation-causes-chatgpt-psychosis?msockid=0eaf0f183ee9627421c61a343f786339\" rel=\"nofollow noopener\" target=\"_blank\">\u201cChatGPT-induced psychosis\u201d<\/a> can send people spiralling into psychological breakdowns, exposing them to <a href=\"https:\/\/www.rollingstone.com\/culture\/culture-features\/chatgpt-obsession-mental-breaktown-alex-taylor-suicide-1235368941\/\" rel=\"nofollow noopener\" target=\"_blank\">further risks<\/a>. It may be comforting for a person to receive empathetic feedback from a chatbot, but what they may need is careful and thorough assessment by a qualified health professional, appropriate guidance, and access to evidence-based treatment. Reliance on AI for mental health support may delay a person seeking professional help and accessing effective treatments that lead to recovery.<\/p>\n<p>Of specific concern is the lack of appropriate oversight or safeguards for people who may be at risk of harm to themselves or others. AI chatbots were not designed to manage crisis situations and may provide <a href=\"https:\/\/arxiv.org\/pdf\/2504.18412\" rel=\"nofollow noopener\" target=\"_blank\">inappropriate responses<\/a>. Chatbots are not subject to the same rigorous professional standards (eg, ethical guidelines) as health professionals. 
Mental health data contains information that is private, personal, and potentially stigmatising, and there are <a href=\"https:\/\/arxiv.org\/pdf\/2402.09716\" rel=\"nofollow noopener\" target=\"_blank\">privacy risks<\/a> associated with the use of chatbots. Most of these tools were not designed in Australia and are not regulated, and there is no accountability if things go wrong.<\/p>\n<p>Looking forward<\/p>\n<p>Given the balance of risks and benefits of using AI chatbots for mental health, we encourage cautious optimism and more rigorous oversight. These tools have the potential to facilitate access to mental health information, provide practical advice, and <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/38317020\/\" rel=\"nofollow noopener\" target=\"_blank\">encourage people to access mental health treatment<\/a>. However, we need this information and advice to be reliable and valid, and we need this access to be secure.<\/p>\n<p>Conclusion<\/p>\n<p>The AI landscape is rapidly evolving, with chatbots becoming increasingly <a href=\"https:\/\/www.nature.com\/articles\/s41598-024-55949-y\" rel=\"nofollow noopener\" target=\"_blank\">sophisticated and human-like<\/a>. There is an urgent need for education (AI literacy) for medical professionals and the wider community, as well as for policy, regulation, guidelines, and safeguards to limit the likelihood of harm. Medical professionals should be aware that their patients may be using AI chatbots to support their mental health. However, at present, AI chatbots cannot replace trained mental health professionals and do have the potential to cause serious harm.<\/p>\n<p>Dr Anthony Joffe is a clinical psychologist and a post-doctoral research fellow at the Black Dog Institute.<\/p>\n<p>Professor Jill Newby is a professor, NHMRC emerging leader, and clinical psychologist at the Black Dog Institute and UNSW Sydney. 
She is also co-director of the NHMRC <a href=\"https:\/\/www.blackdoginstitute.org.au\/cre-precision\/\" rel=\"nofollow noopener\" target=\"_blank\">Centre of Research Excellence in Depression Treatment Precision<\/a>.<\/p>\n<p>Dr Kylie Maidment is a post-doctoral fellow in policy research and translation at the Black Dog Institute and UNSW Sydney.<\/p>\n<p>The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the\u00a0MJA\u00a0or\u00a0InSight+\u00a0unless so stated.\u00a0<\/p>\n<p>Subscribe to the free\u00a0InSight+\u00a0weekly newsletter\u00a0<a rel=\"noreferrer noopener nofollow\" href=\"https:\/\/insightplus.mja.com.au\/subscription\/\" target=\"_blank\">here<\/a>. It is available to all readers, not just registered medical practitioners.\u00a0<\/p>\n<p>If you would like to submit an article for consideration, send a Word version to\u00a0<a href=\"mailto:mjainsight-editor@ampco.com.au\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">mjainsight-editor@ampco.com.au<\/a>.\u00a0<\/p>\n","protected":false},"excerpt":{"rendered":"An increasing number of people are using artificial intelligence (AI) chatbots such as ChatGPT for mental health
and&hellip;\n","protected":false},"author":2,"featured_media":93580,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,254,255,64,63,105],"class_list":{"0":"post-93579","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-au","12":"tag-australia","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/93579","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=93579"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/93579\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/93580"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=93579"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=93579"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=93579"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}