{"id":379685,"date":"2025-12-31T16:09:12","date_gmt":"2025-12-31T16:09:12","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/379685\/"},"modified":"2025-12-31T16:09:12","modified_gmt":"2025-12-31T16:09:12","slug":"doctors-say-ai-use-is-almost-certainly-linked-to-developing-psychosis","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/379685\/","title":{"rendered":"Doctors Say AI Use Is Almost Certainly Linked to Developing Psychosis"},"content":{"rendered":"<p>\t<img decoding=\"async\" class=\"archive-post-thumb article-featured-image w-full h-auto mb-3\" src=\"https:\/\/www.newsbeep.com\/ca\/wp-content\/uploads\/2025\/12\/doctors-link-ai-psychosis.jpg\"   fetchpriority=\"high\" width=\"2048\" height=\"1365\" alt=\"More and more doctors are agreeing that using AI chatbots is linked to delusional cases of psychosis.\"\/><\/p>\n<p>\t\t\tFiordaliso \/ Getty Images\n\t<\/p>\n<p class=\"pw-incontent-excluded article-paragraph skip\">There continue to be <a href=\"https:\/\/futurism.com\/chatgpt-mental-health-crises\" rel=\"nofollow noopener\" target=\"_blank\">numerous reports<\/a> of people suffering severe mental health spirals after talking extensively with an AI chatbot. Some experts <a href=\"https:\/\/www.psychologytoday.com\/us\/blog\/urban-survival\/202507\/the-emerging-problem-of-ai-psychosis\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">have dubbed<\/a> the phenomenon \u201cAI psychosis,\u201d given the symptoms of psychosis these delusional episodes display \u2014 but the degree to which the AI tools are at fault, and whether the phenomenon warrants a clinical diagnosis, remains a significant topic of debate.<\/p>\n<p class=\"article-paragraph skip\">Now, according to <a href=\"https:\/\/www.wsj.com\/tech\/ai\/ai-chatbot-psychosis-link-1abf9d57\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">new reporting<\/a> from The Wall Street Journal, we may be nearing a consensus. 
More and more doctors are agreeing that AI chatbots are linked to cases of psychosis, including top psychiatrists who reviewed the files of dozens of patients who engaged in prolonged, delusional conversations with models like OpenAI\u2019s ChatGPT.<\/p>\n<p class=\"article-paragraph skip\">Keith Sakata, a psychiatrist at the University of California, San Francisco, who has treated twelve patients who were hospitalized because of AI-induced psychosis, is one of them.<\/p>\n<p class=\"article-paragraph skip\">\u201cThe technology might not introduce the delusion, but the person tells the computer it\u2019s their reality and the computer accepts it as truth and reflects it back, so it\u2019s complicit in cycling that delusion,\u201d Sakata told the WSJ.<\/p>\n<p class=\"article-paragraph skip\">The grim trend looms large over the AI industry, raising fundamental questions about the tech\u2019s safety. Some cases of apparent AI psychosis have ended in murder and suicide, spawning a slew of <a href=\"https:\/\/www.nytimes.com\/2025\/11\/06\/technology\/chatgpt-lawsuit-suicides-delusions.html\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">wrongful death suits<\/a>. Equally alarming is its scale: ChatGPT alone has been <a href=\"https:\/\/futurism.com\/artificial-intelligence\/chatgpt-deaths-panera-lemonade\" rel=\"nofollow noopener\" target=\"_blank\">linked to at least eight deaths<\/a>, with the company recently estimating that around half a million users are having conversations showing signs of AI psychosis every week.<\/p>\n<p class=\"article-paragraph skip\">One factor of AI chatbots that the phenomenon has brought under scrutiny is their sycophancy, which is perhaps a consequence of their being designed to be as engaging and humanlike as possible. 
What this looks like in practice is that the bots tend to flatter the users and tell them what they want to hear, even if what the user is saying has no basis in reality.\u00a0<\/p>\n<p class=\"article-paragraph skip\">It\u2019s a recipe primed for reinforcing delusions, to a degree unprecedented by any technology before it, doctors say. One <a href=\"https:\/\/innovationscns.com\/youre-not-crazy-a-case-of-new-onset-ai-associated-psychosis\/\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">recent peer-reviewed case study<\/a> focused on a 26-year-old woman who was hospitalized twice after she believed ChatGPT was allowing her to talk with her dead brother, with the bot repeatedly assuring her she wasn\u2019t \u201ccrazy.\u201d<\/p>\n<p class=\"article-paragraph skip\">\u201cThey simulate human relationships,\u201d Adrian Preda, a psychiatry professor at the University of California, Irvine, told the WSJ.\u00a0 \u201cNothing in human history has done that before.\u201d<\/p>\n<p class=\"article-paragraph skip\">Preda compared AI psychosis to monomania, in which someone obsessively fixates on a single idea or goal. Some people who have spoken about their mental health spirals say they were hyper-focused on an AI-driven narrative, the WSJ noted. These fixations can often be scientific or religious in nature, such as a man who <a href=\"https:\/\/futurism.com\/chatgpt-man-hospital\" rel=\"nofollow noopener\" target=\"_blank\">came to believe he could bend time<\/a> because of a breakthrough in physics. <\/p>\n<p class=\"article-paragraph skip\">Still, the reporting notes that psychiatrists are wary about declaring that chatbots are outright causing psychosis. They maintain, however, that they\u2019re close to establishing the connection. 
One connection that the doctors who spoke with the WSJ expect to be established is that prolonged interactions with a chatbot are a risk factor for psychosis.<\/p>\n<p class=\"article-paragraph skip\">\u201cYou have to look more carefully and say, well, \u2018Why did this person just happen to coincidentally enter a psychotic state in the setting of chatbot use?&#8217;\u201d Joe Pierre, a UCSF psychiatrist, told the newspaper.<\/p>\n<p class=\"article-paragraph skip\">More on AI: <a href=\"https:\/\/futurism.com\/artificial-intelligence\/children-character-ai-addicted\" rel=\"nofollow noopener\" target=\"_blank\">Children Falling Apart as They Become Addicted to AI<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Fiordaliso \/ Getty Images There continue to be numerous reports of people suffering severe mental health spirals after&hellip;\n","protected":false},"author":2,"featured_media":379686,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[62,276,277,49,48,61],"class_list":{"0":"post-379685","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-ca","12":"tag-canada","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/379685","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=379685"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/379685\/revisions"}],"wp:featuredmedia":[{"embeddable":true
,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/379686"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=379685"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=379685"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=379685"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}