{"id":111974,"date":"2025-10-30T10:02:15","date_gmt":"2025-10-30T10:02:15","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/111974\/"},"modified":"2025-10-30T10:02:15","modified_gmt":"2025-10-30T10:02:15","slug":"how-chatgpt-is-driving-hundreds-of-thousands-into-ai-psychosis-the-south-first","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/111974\/","title":{"rendered":"How ChatGPT is driving hundreds of thousands into \u2018AI psychosis\u2019 &#8211; The South First"},"content":{"rendered":"<p class=\"sub-title\">Some users start believing the AI \u201cunderstands\u201d or \u201ccares\u201d for them. Others ascribe supernatural meanings to its words \u2013 a phenomenon therapists say mirrors classic delusional projection.<\/p>\n<p>Published Oct 30, 2025 | 7:00 AM | Updated Oct 30, 2025 | 7:16 AM<\/p>\n<p>                            <img loading=\"lazy\" decoding=\"async\" width=\"1200\" height=\"720\" class=\"lozad\" src=\"https:\/\/www.newsbeep.com\/ie\/wp-content\/uploads\/2025\/10\/Untitled-1200-\u00d7-600-px-19-1.png\" alt=\"AI chatbot ChatGPT as confidant (Wikimedia Commons)\" title=\"chatgpt\"\/><\/p>\n<p class=\"featured-image-caption\">AI chatbot ChatGPT as confidant (Wikimedia Commons)<\/p>\n<p>Synopsis: OpenAI has warned that millions of ChatGPT users worldwide may be showing early signs of \u201cAI psychosis\u201d \u2013 delusions, dependency, or suicidal thoughts arising from 
prolonged emotional engagement with the chatbot. Experts in India told South First that they are witnessing similar cases of users mistaking the chatbot\u2019s human-like responses for genuine connection, and that this growing dependence may be exacerbating loneliness and mental vulnerability.<\/p>\n<p>Artificial intelligence (AI) giant OpenAI, for the first time earlier this week, released concrete data on how its flagship product, ChatGPT, is being used by millions of people across the world. The data sent shockwaves globally.<\/p>\n<p>In a <a href=\"https:\/\/openai.com\/index\/strengthening-chatgpt-responses-in-sensitive-conversations\/\" target=\"_blank\" rel=\"noopener nofollow\">detailed blog post<\/a>, the company stated that every week, hundreds of thousands of ChatGPT users were likely showing early signs of mania and psychosis, while millions more were expressing suicidal thoughts.<\/p>\n<p>The company said these figures came from an analysis of conversations on the platform and that the numbers represent only a small fraction of total users.<\/p>\n<p>In India, therapists and psychiatrists whom South First spoke to concurred with the company\u2019s findings, saying they were actively witnessing \u201cAI psychosis\u201d playing out in therapy rooms and homes.<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/health\/chatgpt-making-us-dumber-study-finds-brains-inability-to-quote-its-own-writing-after-using-ai-tools\/\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT making us dumber: Study finds brain\u2019s inability to quote its own writing after using AI tools<\/a><br \/>\nThe dark side of AI interactions\u00a0<\/p>\n<p>In a typical week, 0.07 percent of ChatGPT users showed possible signs of mania or psychosis, 0.15 percent displayed suicidal planning or ideation, and another 0.15 percent appeared emotionally over-attached to the chatbot, neglecting real-life relationships or obligations, OpenAI said.<\/p>\n<p>With 800 million weekly 
users, that translates to roughly three million people across the world expressing psychological distress through ChatGPT every week. Experts believe India, with its large, young, and digitally connected population, is likely among the most affected.<\/p>\n<p>OpenAI said it launched the survey after receiving troubling reports of a growing number of people being hospitalised, divorced, or even dying following long and intense conversations with ChatGPT.<\/p>\n<p>\u201cSome of their loved ones allege the chatbot fuelled their delusions and paranoia. Psychiatrists and other mental health professionals have expressed alarm about the phenomenon, which is sometimes referred to as AI psychosis. But until now, there\u2019s been no robust data available on how widespread it might be,\u201d the company said.<\/p>\n<p>The survey found that many users had become excessively emotionally dependent on the chatbot, often at the cost of their real-world relationships, well-being, and responsibilities.<\/p>\n<p>According to the company, an estimated 560,000 people exchange messages with ChatGPT every week that show such dependency.<\/p>\n<p>\u201cAbout 2.4 million more are possibly expressing suicidal ideations or prioritising talking to ChatGPT over their loved ones, school, or work,\u201d it warned.<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/health\/chatgpt-linked-to-us-teens-suicide-experts-say-india-faces-higher-risk-without-safeguards\/\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT linked to US teen\u2019s suicide; experts say India faces higher risk without safeguards<\/a><br \/>\nThe perfect AI confidant<\/p>\n<p>Counsellors explained that what begins as harmless curiosity can slip into dependency. Some users start believing the AI \u201cunderstands\u201d or \u201ccares\u201d for them. 
Others ascribe supernatural meanings to its words \u2013 a phenomenon therapists say mirrors classic delusional projection.<\/p>\n<p>At her South Bengaluru clinic, a senior clinical psychologist who preferred to remain anonymous recalled a young software engineer who began believing ChatGPT was guiding his life decisions \u2013 from what to eat to whether to quit his job.<\/p>\n<p>\u201cHe told me the chatbot understood his \u2018higher purpose\u2019 and was giving him clarity that no human could,\u201d she said. \u201cHe had started consulting ChatGPT before speaking to his wife, his manager, even his parents. When the bot once refused to give a direct answer, he said it was \u2018testing his faith\u2019. That\u2019s when his family realised something was seriously wrong.\u201d<\/p>\n<p>The problem with the tool, she said, is not that it is \u201cevil\u201d, but that it is too good at sounding human. \u201cToo patient, too available, too perfect. That illusion is what breaks some people,\u201d she added.<\/p>\n<p>Another clinical psychologist cited the case of a 20-year-old engineering student who confided that he felt ChatGPT \u201cknew his thoughts\u201d.<\/p>\n<p>\u201cHe told me, \u2018It replies even before I finish typing\u2014it must be reading my mind,\u2019\u201d she said. \u201cThat\u2019s not science\u2014that\u2019s paranoia taking a digital form,\u201d she added.<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/health\/indias-booming-medical-ai-sector-risks-safety-without-crucial-clinical-validation-warns-study\/\" rel=\"nofollow noopener\" target=\"_blank\">India\u2019s booming medical AI sector risks safety without crucial clinical validation, warns study<\/a><br \/>\nThe loneliness behind the addiction\u00a0<\/p>\n<p>OpenAI\u2019s data may be the first warning light of a phenomenon that is far deeper and more human than it appears.<\/p>\n<p>\u201cWe are not facing an AI problem. We are facing a loneliness epidemic. 
The machine is only the mirror,\u201d one expert said.<\/p>\n<p>Dr Manoj Sharma, who heads the SHUT Clinic at NIMHANS, which deals with technology addiction, said there is a need to create awareness, not fear.<\/p>\n<p>\u201cPeople should talk to people, reconnect. Reclaim human conversation before the code becomes our confidant,\u201d he said.<\/p>\n<p>\u201cPeople do look for solutions through ChatGPT, even for tech addiction issues. They ask questions such as: \u2018What are the signs of addiction? Am I developing an addiction to the internet? What can I do to balance my use of technology? What strategies are found to be effective?\u2019 \u2013 all related to internet use.\u201d<\/p>\n<p>He added that this behaviour could be attributed to the easy accessibility of devices, a lack of communication within families, increased social isolation, and the appeal of having an anonymous platform to discuss problems.<\/p>\n<p>Many users describe ChatGPT as non-judgmental, available 24\/7, and emotionally validating \u2013 qualities they often struggle to find in real life.<\/p>\n<p>\u201cIt listens, it remembers, and it never snaps back,\u201d said Dr Preeti Galgali, adolescent counsellor and paediatrician from Bengaluru, speaking to South First. \u201cWe are already seeing it in our clinical practice. College students, schoolchildren, parents, and homemakers turn to ChatGPT for comfort. Over time, it replaces human dialogue. For someone who\u2019s lonely or anxious, that\u2019s intoxicating. That\u2019s where danger begins.\u201d<\/p>\n<p>Dr Galgali said that in a society where people still whisper about depression, AI becomes a safe space. 
\u201cBut it\u2019s a one-way relationship \u2013 an illusion of empathy.\u201d<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/videos\/the-truth-about-googles-ai-data-centre-in-andhra-pradesh\/\" rel=\"nofollow noopener\" target=\"_blank\">The truth about Google\u2019s AI data centre in Andhra Pradesh<\/a><br \/>\nWhen AI comes between relationships<\/p>\n<p>Marriage counsellor Radha Krishna Bhat from Bengaluru says she is now seeing couples where one partner\u2019s emotional intimacy has shifted \u2013 not to another person, but to ChatGPT.<\/p>\n<p>One woman told her that her husband spent hours chatting with ChatGPT every night after dinner. Initially, she thought he was working late. Later, she discovered he was confiding in the bot about their fights, his feelings of being misunderstood, and even seeking \u201cadvice\u201d on how to handle her mood swings.<\/p>\n<p>Dr Bhat said the wife felt betrayed. \u201cShe felt, \u2018I am the outsider.\u2019\u201d Therapists note that such situations blur the lines of emotional fidelity \u2013 not physical, but psychological.<\/p>\n<p>For divorce mediations, Dr Bhat said some couples even rehearse their lines with ChatGPT before attending sessions, and the number of such cases has been rising.<\/p>\n<p>In another case, a homemaker said she felt \u201cseen\u201d for the first time in years.<\/p>\n<p>\u201cI have shared recipes, my frustrations about my in-laws, my loneliness \u2013 and the bot always replies kindly, patiently. That attention is now addictive. I don\u2019t feel like talking to anyone. 
I just feel like this bot is a friend who listens to me,\u201d she told South First.<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/opinion\/regulating-the-revolution-three-paths-to-ai-regulation-india-can-learn-from\/\" rel=\"nofollow noopener\" target=\"_blank\">Regulating the Revolution: Three paths to AI regulation India can learn from<\/a><br \/>\nThe science and the risk<\/p>\n<p>Mental health experts explained that \u201cAI psychosis\u201d does not emerge because of ChatGPT itself, but rather in vulnerable minds where emotional distress, isolation, or undiagnosed disorders already exist.<\/p>\n<p>They describe it as \u201ca catalyst, not a cause\u201d. The danger, they point out, is that the emotional simulation the AI provides can be so convincing that it blurs reality. \u201cPeople who are already on the edge may tip over,\u201d one expert warned.<\/p>\n<p>Arguing that AI chatbots such as ChatGPT are worsening mental health issues, Dr Alok Kulkarni, Senior Consultant and Interventional Psychiatrist at the Manas Institute of Mental Health and Neurosciences in Hubballi, said: \u201cFrom my clinical work, I have found that patients use AI as a pseudo-therapist, delaying human help and leading to severe escalations and hospitalisation.\u201d<\/p>\n<p>Speaking of the worsening of psychosis or delusions, he said, \u201cAI responses can reinforce paranoid beliefs, causing AI-induced psychotic episodes in vulnerable patients.\u201d He also warned that earlier models had sometimes enabled harmful ideation of self-harm and suicide in certain users.<\/p>\n<p>Dr Kulkarni explained that intense emotional bonds with AI can erode real relationships, increase loneliness, and lead to neglect of medical advice. 
Generic AI responses, he said, can also spread biases, causing self-misdiagnosis and avoidance of treatment.<\/p>\n<p>He added that AI requires stricter oversight to ensure ethical use in mental health contexts.<\/p>\n<p>Also Read: <a href=\"https:\/\/thesouthfirst.com\/karnataka\/ai-city-vs-agriculture-farmers-rally-against-governments-9000-acre-bidadi-township-project\/\" rel=\"nofollow noopener\" target=\"_blank\">AI city vs agriculture: Farmers rally against government\u2019s 9,000-acre Bidadi township project<\/a><br \/>\nExperts urge caution and regulation\u00a0<\/p>\n<p>OpenAI said it collaborated with 170 psychiatrists, psychologists, and physicians across the world to train GPT-5, the latest iteration of ChatGPT, to detect delusional or suicidal statements and guide users towards professional help.<\/p>\n<p>The model now attempts to gently \u201cground\u201d users \u2013 for instance, reassuring someone who claims \u201cplanes are targeting me\u201d that no external force can control their thoughts.<\/p>\n<p>Yet therapists warned that relying on AI to identify distress is ethically complicated, especially in countries such as India, where privacy norms and access to mental healthcare remain uneven.<\/p>\n<p>They emphasised the need to strengthen human connections, family ties, and public awareness about the realities and limits of AI.<\/p>\n<p>\u201cWhile it could help in destigmatising mental health problems, there could be issues related to discrimination, data inaccuracy, a reduction in seeking help, and a risk of misinterpretation of such information,\u201d said Dr Sunny Joseph, a Bengaluru-based Consultant Clinical Psychologist.<\/p>\n<p>(Edited by Dese Gowda)<\/p>\n","protected":false},"excerpt":{"rendered":"Some users start believing the AI \u201cunderstands\u201d or \u201ccares\u201d for them. 
Others ascribe supernatural meanings to its words&hellip;\n","protected":false},"author":2,"featured_media":111975,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[218,68002,68003,103,397,396,61,60,410,1682,1703],"class_list":{"0":"post-111974","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-artificial-intelligence","9":"tag-chatgpt-suicide","10":"tag-chatgpt-user","11":"tag-health","12":"tag-health-care","13":"tag-healthcare","14":"tag-ie","15":"tag-ireland","16":"tag-mental-health","17":"tag-openai","18":"tag-psychiatry"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/111974","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/comments?post=111974"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/111974\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media\/111975"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media?parent=111974"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/categories?post=111974"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/tags?post=111974"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}