{"id":520964,"date":"2026-03-07T18:53:10","date_gmt":"2026-03-07T18:53:10","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/520964\/"},"modified":"2026-03-07T18:53:10","modified_gmt":"2026-03-07T18:53:10","slug":"when-you-should-and-shouldnt-use-chatgpt-as-a-therapist-from-experts","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/520964\/","title":{"rendered":"When you should\u2014and shouldn&#8217;t\u2014use ChatGPT as a therapist, from experts"},"content":{"rendered":"<p>As Americans <a href=\"https:\/\/news.gallup.com\/poll\/651881\/daily-loneliness-afflicts-one-five.aspx\" target=\"_blank\" rel=\"nofollow noopener\">get lonelier and lonelier<\/a>, a growing number of people are getting some emotional support from artificial intelligence chatbots \u2014 and some mental health experts are concerned.<\/p>\n<p>&#8220;The topic of AI for therapy [and] emotional support companionship is coming up a lot,&#8221; says <a href=\"https:\/\/www.apa.org\/practice\/technology-innovation-measuring-care\" target=\"_blank\" rel=\"nofollow noopener\">Leanna Fortunato<\/a>, a licensed clinical psychologist and director of quality and health care innovation for the American Psychological Association. &#8220;Anecdotally, providers are talking about it, and we know from the research that people are using AI tools for that kind of support more and more.&#8221;<\/p>\n<p>Some chatbot users accidentally fall into mental health-related conversations \u2014 by complaining about a stressful day to a digital entity that&#8217;s guaranteed to listen, for example. 
Others may seek mental health advice from an AI chatbot <a href=\"https:\/\/www.statnews.com\/2025\/11\/15\/chatbot-therapist-characterai-mental-health-guardrails\/\" target=\"_blank\" rel=\"nofollow noopener\">that isn&#8217;t a licensed professional<\/a>, but is <a href=\"https:\/\/www.kff.org\/affordable-care-act\/kff-survey-of-consumer-experiences-with-health-insurance\/\" target=\"_blank\" rel=\"nofollow noopener\">less expensive than a therapist<\/a>, Fortunato says.<\/p>\n<p>In a health research <a href=\"https:\/\/jamanetwork.com\/journals\/jamanetworkopen\/fullarticle\/2844128\" target=\"_blank\" rel=\"nofollow noopener\">survey <\/a>of more than 20,000 U.S. adults, 10.3% of participants said they used generative AI daily. Of that group, 87.1% of them reported using the tech for personal reasons including advice and emotional support. The study was published on Jan. 21 and conducted by researchers from institutions including Massachusetts General Hospital, Weill Cornell Medicine and Northeastern University.<\/p>\n<p>On TikTok, the search term &#8220;<a href=\"https:\/\/www.tiktok.com\/discover\/therapy-ai-bot\" target=\"_blank\" rel=\"nofollow noopener\">Therapy AI Bot<\/a>&#8221; has at least 11.5 million posts, ranging from users sharing their best prompts for turning chatbots into therapists to health experts warning about the potential dangers involved.<\/p>\n<p>Technology companies are <a href=\"https:\/\/www.nytimes.com\/2026\/01\/29\/technology\/openai-in-talks-to-raise-as-much-as-100-billion.html\" target=\"_blank\" rel=\"nofollow noopener\">spending billions of dollars<\/a> developing AI tools and attempting to further integrate them into people&#8217;s daily lives. 
But historically, AI chatbots <a href=\"https:\/\/www.pbs.org\/newshour\/show\/what-to-know-about-ai-psychosis-and-the-effect-of-ai-chatbots-on-mental-health\" target=\"_blank\" rel=\"nofollow noopener\">don&#8217;t always understand<\/a> when a user is experiencing a serious health crisis, and may not always respond to them accordingly. The New York Times found &#8220;nearly 50 cases of people having mental health crises during conversations with ChatGPT,&#8221; including three deaths, in a <a href=\"https:\/\/www.nytimes.com\/2025\/11\/23\/technology\/openai-chatgpt-users-risks.html\" target=\"_blank\" rel=\"nofollow noopener\">Nov. 23 report<\/a>.<\/p>\n<p>Companies like <a href=\"https:\/\/www.anthropic.com\/news\/protecting-well-being-of-users\" target=\"_blank\" rel=\"nofollow noopener\">Anthropic<\/a>, <a href=\"https:\/\/blog.google\/company-news\/outreach-and-initiatives\/public-policy\/gavalas-lawsuit-response\/\" target=\"_blank\" rel=\"nofollow noopener\">Google<\/a> and ChatGPT-maker <a href=\"https:\/\/openai.com\/index\/strengthening-chatgpt-responses-in-sensitive-conversations\/\" target=\"_blank\" rel=\"nofollow noopener\">OpenAI<\/a> say they&#8217;re working with mental health experts to strengthen their tools&#8217; responses to sensitive conversations. 
&#8220;These are incredibly heartbreaking situations and our thoughts are with all those impacted,&#8221; an OpenAI spokesperson tells CNBC Make It.\u00a0&#8220;We continue to improve ChatGPT&#8217;s training to recognize and respond\u00a0to signs of distress, de-escalate conversations in sensitive moments, and guide people toward real-world support, working closely with mental health clinicians and experts.&#8221;<\/p>\n<p>Frequent conversations with AI companions can erode people&#8217;s real-life social skills, according to <a href=\"https:\/\/link.springer.com\/article\/10.1007\/s00146-025-02318-6\" target=\"_blank\" rel=\"nofollow noopener\">an April 2025 paper<\/a> written by an OpenAI product policy researcher. Heavy daily use of ChatGPT is correlated with increased loneliness, found <a href=\"https:\/\/arxiv.org\/abs\/2504.03888\" target=\"_blank\" rel=\"nofollow noopener\">an OpenAI-MIT Media Lab study<\/a> also published in April 2025.<\/p>\n<p>The American Psychological Association <a href=\"https:\/\/www.apa.org\/topics\/artificial-intelligence-machine-learning\/health-advisory-chatbots-wellness-apps\" target=\"_blank\" rel=\"nofollow noopener\">strongly advises against<\/a> using AI as a substitute for therapy and mental health support.<\/p>\n<p>Some mental health professionals say you can still engage with chatbots risk-free about certain related topics. Here&#8217;s what you need to know.<\/p>\n<p><a id=\"headline0\"\/>&#8216;I see it as a tool, and I think that a tool can be helpful&#8217;<\/p>\n<p>AI chatbots can be useful for learning about mental health, says psychotherapist and lifestyle coach <a href=\"https:\/\/www.eternalwellnesscounseling.com\/about\" target=\"_blank\" rel=\"nofollow noopener\">Esin Pinarli<\/a>. 
They can help you generate journaling prompts for reflection, and you can ask them for links to research papers about coping strategies, treatment options and other questions you may have about mental health conditions, she says.<\/p>\n<p>&#8220;I don&#8217;t see it as [a substitute for] therapy. I see it as a tool, and I think that a tool can be helpful,&#8221; says Pinarli, the founder of Boca Raton, Florida-based private practice Eternal Wellness Counseling. Her clients sometimes talk to ChatGPT about specific situations in their personal lives, and then run its responses past her before acting on them, she says.<\/p>\n<p>In her personal AI testing, Pinarli has seen chatbots sometimes use language that supports a user&#8217;s &#8220;unhealthy behaviors,&#8221; she says. If you ask a chatbot about a confrontation you had with a friend, it might tell you that your friend is being too sensitive, for example \u2014 even if you&#8217;re actually the one in the wrong.<\/p>\n<p>If an exchange with an AI chatbot touches on your mental health, Fortunato recommends asking yourself:<\/p>\n<p>Is there a reputable source that I can cross-check this information with?<\/p>\n<p>Do I have a provider that I can ask these questions to?<\/p>\n<p>Reputable sources could include peer-reviewed scientific studies, articles from health news organizations or resources from medical organizations like Harvard Health Publishing or the Mayo Clinic. &#8220;AI could really increase people&#8217;s access to health information,&#8221; Fortunato says. &#8220;[But] AI isn&#8217;t necessarily going to always give you correct information.&#8221;<\/p>\n<p><a id=\"headline1\"\/>Keep these considerations in mind when using AI<\/p>\n<p>Pinarli and Fortunato agree that people shouldn&#8217;t use AI chatbots for getting a diagnosis or support in a mental health crisis, especially suicidal ideation. 
During an active mental health crisis, you can always call or text the Suicide and Crisis Lifeline (988), which is confidential and available 24 hours a day, seven days a week, free of cost.<\/p>\n<p>&#8220;We&#8217;ve seen <a href=\"https:\/\/www.npr.org\/sections\/shots-health-news\/2025\/09\/19\/nx-s1-5545749\/ai-chatbots-safety-openai-meta-characterai-teens-suicide\" target=\"_blank\" rel=\"nofollow noopener\">some really high-profile harms<\/a>, particularly for youth or vulnerable groups who might be in crisis, where AI didn&#8217;t handle the situation correctly,&#8221; Fortunato says. &#8220;It continued to engage with people who were in crisis. It didn&#8217;t provide crisis resources. It didn&#8217;t challenge a pattern of thinking that was problematic.&#8221;<\/p>\n<p>They also both say that you shouldn&#8217;t share your medical records or any personal identifying information with a chatbot, because those conversations aren&#8217;t confidential or legally protected. And you generally shouldn&#8217;t rely on AI to solve problems in your real-life human relationships, says Pinarli.<\/p>\n<p>&#8220;You need another person with another nervous system across from you in order to pay attention to body language, to tone of voice,&#8221; she says. Chatbots are &#8220;not going to challenge you emotionally, and they don&#8217;t require reciprocity.&#8221;<\/p>\n<p>If you&#8217;re experiencing a mental health crisis or concerning mental health symptoms, you can contact the free, confidential National Helpline for Mental Health at\u00a01-800-662-HELP (4357).<\/p>\n<p>Want to improve your communication, confidence and success at work? 
Take CNBC&#8217;s new online course, <a href=\"https:\/\/smarter.cnbcmakeit.com\/p\/body-language-to-boost-your-influence?utm_source=cnbc&amp;utm_medium=makeitarticle&amp;utm_campaign=bottom\" target=\"_blank\" rel=\"nofollow noopener\">Master Your Body Language To Boost Your Influence<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"As Americans get lonelier and lonelier, a growing number of people are getting some emotional support from artificial&hellip;\n","protected":false},"author":2,"featured_media":520965,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[26743,276,49,48,26817,26816,26818,19216,84,392,205507,393,15850,13627,1236,205506,15927,772,14938],"class_list":{"0":"post-520964","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-alphabet-class-a","9":"tag-artificial-intelligence","10":"tag-ca","11":"tag-canada","12":"tag-cigna-group","13":"tag-comcast-corp","14":"tag-cvs-health-corp","15":"tag-elevance-health-inc","16":"tag-health","17":"tag-healthcare","18":"tag-lifestance-health-group-inc","19":"tag-mental-health","20":"tag-meta-platforms-inc","21":"tag-microsoft-corp","22":"tag-social-issues","23":"tag-talkspace-inc","24":"tag-teladoc-health-inc","25":"tag-united-states","26":"tag-unitedhealth-group-inc"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca
\/wp-json\/wp\/v2\/posts\/520964","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=520964"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/520964\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/520965"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=520964"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=520964"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=520964"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}