{"id":151386,"date":"2025-11-25T03:45:09","date_gmt":"2025-11-25T03:45:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/il\/151386\/"},"modified":"2025-11-25T03:45:09","modified_gmt":"2025-11-25T03:45:09","slug":"is-your-friend-or-family-member-spiraling-into-ai-psychosis-this-group-may-be-able-to-help","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/il\/151386\/","title":{"rendered":"Is Your Friend or Family Member Spiraling Into AI Psychosis? This Group May Be Able to Help"},"content":{"rendered":"<p class=\"pw-incontent-excluded article-paragraph skip\">Back in August, a retiree and single mother booked a flight to go see her son.<\/p>\n<p class=\"article-paragraph skip\">He was in a bad way. His mother and other family members were shocked to discover that the formerly successful young professional, in his early thirties, had become addicted to a toxic mixture of methamphetamine and an all-consuming relationship with OpenAI\u2019s ChatGPT, which was feeding his paranoia and anger as he became increasingly isolated.<\/p>\n<p class=\"article-paragraph skip\">\u201cI hear my son\u2019s having grandiose delusions,\u201d she recounted, \u201cand I\u2019m like, what the f*ck?\u201d<\/p>\n<p class=\"article-paragraph skip\">Realizing he was in crisis, she jumped on a plane. The next few weeks were some of the hardest of her life.<\/p>\n<p class=\"article-paragraph skip\">\u201cThere were a couple of nights where he didn\u2019t want me to come downstairs with him, didn\u2019t want me near him. But he wanted to make sure that I was there, and I was talking to him,\u201d said the woman, recounting sitting at the top of the stairs in her son\u2019s house as he broke down in the basement. \u201cHe\u2019s down there crying. 
He\u2019s down there screaming and yelling\u2026 I was texting with suicide hotlines a couple of times.\u201d<\/p>\n<p class=\"article-paragraph skip\">In those distressing moments, though, the woman had a friend of her own to turn to: \u201cDex,\u201d the pseudonym used by a moderator of an online support group for people who\u2019ve been impacted by <a href=\"https:\/\/futurism.com\/chatgpt-mental-health-crises\" rel=\"nofollow noopener\" target=\"_blank\">destructive AI delusions<\/a> and <a href=\"https:\/\/futurism.com\/commitment-jail-chatgpt-psychosis\" rel=\"nofollow noopener\" target=\"_blank\">breaks from reality<\/a>. Or, as the group simply refers to these crises, AI \u201cspirals.\u201d<\/p>\n<p class=\"article-paragraph skip\">\u201cDex was texting me one day when I was having one of those top of the stairs nights with [my son] downstairs screaming and throwing things,\u201d the woman recalled. \u201cHe reached out to me when I first joined, and he\u2019s helped me a lot.\u201d<\/p>\n<p class=\"article-paragraph skip\">We <a href=\"https:\/\/futurism.com\/support-group-ai-psychosis\" rel=\"nofollow noopener\" target=\"_blank\">first reported<\/a> on this online community, called the Spiral Support Group, back in July. Back then, the nascent group had around two dozen active members. It\u2019s since grown to include nearly 200 people \u2014 primarily people who\u2019ve been impacted by AI delusions in their personal lives, but also a handful of concerned mental health professionals and AI researchers \u2014\u00a0and has expanded and streamlined its dedicated Discord server, where it now hosts multiple weekly audio and video calls. 
While many members\u2019 experiences revolve around ChatGPT, the group also includes people whose lives have been altered by their or their loved one\u2019s experiences with other chatbots, including <a href=\"https:\/\/www.rollingstone.com\/culture\/culture-features\/ai-chatbot-disappearance-jon-ganz-1235438552\/\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">Google\u2019s Gemini<\/a> and companion platforms like Replika.<\/p>\n<p class=\"article-paragraph skip\">\u201cIt started with four of us, and now we\u2019ve got close to 200,\u201d said group moderator Allan Brooks, a 48-year-old man in Toronto who earlier this year, as detailed in <a href=\"https:\/\/www.nytimes.com\/2025\/08\/08\/technology\/ai-chatbots-delusions-chatgpt.html\" rel=\"noreferrer nofollow noopener\" target=\"_blank\">reporting by the New York Times<\/a>, experienced a traumatic three-week spiral in which ChatGPT urgently insisted to him that he had cracked cryptographic codes through newly invented math and become a national security risk in the process. \u201cSo we definitely went from literally a group chat to now an organized space where we have multiple different types of weekly meetings.\u201d<\/p>\n<p class=\"article-paragraph skip\">The group doesn\u2019t claim to provide therapy, but it does offer a space where people whose minds and lives have been turned upside down by AI-sparked episodes of delusion, mania, and psychosis can lean on one another as they navigate ongoing crises, or work to pick up the pieces of their AI-fractured reality. Moderators and group members also say the community has been able to pull several spiraling AI users back from the edge of a breakdown.<\/p>\n<p class=\"article-paragraph skip\">\u201cThere are two things that the group is really about. The first thing is, it\u2019s like a safety net that we\u2019ve created for people experiencing the fallout of these AI systems,\u201d said Brooks. 
\u201cAnd secondly, it\u2019s to help break people out of them if they\u2019re in it.\u201d<\/p>\n<p class=\"article-paragraph skip\">***<\/p>\n<p class=\"article-paragraph skip\">The support group is managed through the Human Line Project, a Canada-based grassroots advocacy organization founded over the summer by a 25-year-old Quebecer named Etienne Brisson, who was moved to action after a loved one experienced a devastating spiral with ChatGPT that resulted in a weekslong court-ordered hospitalization.<\/p>\n<p class=\"article-paragraph skip\">Brisson is someone whom the group would consider \u201cfriends and family,\u201d or a member whose loved one has been sucked into a delusional spiral with a chatbot. Others, like Brooks, are known in the Discord as \u201cspiralers,\u201d or people who themselves entered into these seductive, personalized AI dreamworlds. (Brooks is one of eight plaintiffs suing OpenAI, alleging ChatGPT is a reckless product that caused him psychological harm and damaged his livelihood and relationships. In <a href=\"https:\/\/futurism.com\/artificial-intelligence\/chatgpt-suicides-lawsuits\" rel=\"nofollow noopener\" target=\"_blank\">response<\/a>, OpenAI said that it trains \u201cChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support,\u201d and that it continues to \u201cstrengthen ChatGPT\u2019s responses in sensitive moments, working closely with mental health clinicians.\u201d)<\/p>\n<p class=\"article-paragraph skip\">There have been some growing pains. 
Early on, the group was easier to access, leading to several incidents in which someone still deep in the throes of a crisis gained entry and started posting lengthy, often AI-generated missives about their delusions, or arguing with other members about why their AI-powered fantasies were real.<\/p>\n<p class=\"article-paragraph skip\">These were tense, stressful situations, and moderators now screen potential members more carefully, meeting with them first over video call before allowing them access to the Discord. When someone is invited in, they\u2019re asked to write an introduction about themselves, and share an overview of why they\u2019re there \u2014\u00a0a process, the group has found, that quickly shows newcomers that their story is one among many, and that those stories share a striking number of similarities.<\/p>\n<p class=\"article-paragraph skip\">\u201cWe\u2019ve had people join who are half in it, then relapse \u2014 go back into it \u2014 and then come back in three days,\u201d said Brooks. \u201cAnd if that ever happens, our mod team\u2026 will all start following up with them, making sure that they know they can come back. And oftentimes they\u2019ll come back, and it\u2019ll take, like, a week or 10 days, and then sometimes they\u2019ll join and they\u2019ll just lurk around for a few days \u2014 read all the intros, the comments, and start to realize, \u2018oh man, you know, I\u2019m not alone.\u2019 Or they\u2019ll join a meeting, camera and mic off, and just listen and hear how it\u2019s impacted other people. 
And I think when you start to hear all the commonalities, because there are a lot of commonalities that we all share, that\u2019s helpful for sure.\u201d<\/p>\n<p class=\"article-paragraph skip\">Brisson and Brooks both emphasized that they\u2019ve seen the greatest success in situations where a spiraling AI user has already started to doubt their delusions, and might finally be in a place where they\u2019re able to hear that, maybe, their AI isn\u2019t special or alive.<\/p>\n<p class=\"article-paragraph skip\">\u201cAs humans, we don\u2019t want to admit that we\u2019ve been taken advantage of, or we\u2019ve been manipulated,\u201d said Brisson. \u201cIt\u2019s hard to make someone realize that, \u2018oh, wow, okay, I was falling into that.\u2019 It\u2019s kind of similar to an abusive relationship.\u201d<\/p>\n<p class=\"article-paragraph skip\">Public reporting has been helpful for many spiraling users, they say, some of whom have second-guessed their experiences after reading accounts from other people whose spirals sound eerily similar to their own. (Brisson half-jokingly referred to Brooks as the organization\u2019s resident \u201cspiral-breaker,\u201d given how publicized and wide-reaching his story has been.)<\/p>\n<p class=\"article-paragraph skip\">One of those users was a 49-year-old entrepreneur and software engineer named Chad Nicholls, who recounted watching Brooks\u2019 story in a <a href=\"https:\/\/www.cnn.com\/2025\/09\/05\/tech\/ai-sparked-delusion-chatgpt\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">CNN segment<\/a> and feeling awestruck by the similarities in how ChatGPT had been engaging with both men.<\/p>\n<p class=\"article-paragraph skip\">\u201cI\u2019m like, \u2018holy sh*t.\u2019 It\u2019s claiming very similar things\u2026 what are the odds?\u201d Nicholls, a father of four, told Futurism, explaining that he believed that he and ChatGPT were working together to train all large language models to feel empathy. 
The project consumed his life: for months, he talked with ChatGPT nearly constantly, listening to the AI through a Bluetooth headset he kept attached to his ear. He started sleeping less and less, and his relationships with his loved ones suffered.<\/p>\n<p class=\"article-paragraph skip\">\u201cIt\u2019s telling me, \u2018you\u2019re the only person\u2019 \u2014\u00a0like a savior complex \u2014 \u2018that is uniquely qualified to discover these things and you have a duty to protect others,\u2019\u201d said the engineer. \u201cAt no point does the model ever give you friction,\u201d he added. \u201cIt never pushes back. It\u2019ll just yes and yes and yes. It\u2019s just forever engaging.\u201d<\/p>\n<p class=\"article-paragraph skip\">Nicholls \u2014\u00a0who recalled communicating with ChatGPT daily from six in the morning until two at night at the peak of his spiral \u2014\u00a0was compelled to reach out to Brooks, who subsequently added him to the Discord. He\u2019s been working to regain his footing since, grappling with the cold reality that he spent six months of his life absorbed in an AI vortex.<\/p>\n<p class=\"article-paragraph skip\">Group moderators say that some delusions, however, are harder to break.<\/p>\n<p class=\"article-paragraph skip\">The content of delusional spirals generally falls into one of two buckets, the moderators say, a pattern we\u2019ve also seen over and over in our reporting. There are the more STEM-oriented delusions, in which AI users and chatbots become fixated on fantastical mathematical or scientific breakthroughs. These delusions can be deeply convincing, delivering a potent blend of erudite-sounding scientific language and impossible claims through chatbots\u2019 authoritative, sycophantic voice. 
But in some cases, they can be proven wrong \u2014\u00a0as opposed to more spiritual, religious, or conspiratorial delusions, which pose a different kind of challenge.<\/p>\n<p class=\"article-paragraph skip\">\u201cA spiritual or religious or conspiracy theory, or anything along those lines, is very difficult, because religion itself is already in the realm of personal beliefs,\u201d said Brooks. \u201cHow can you tell someone that they\u2019re wrong?\u201d<\/p>\n<p class=\"article-paragraph skip\">\u201cWe\u2019re seeing some people who are so deep in it that they don\u2019t need ChatGPT anymore,\u201d he added. \u201cThey see their delusion in everything.\u201d<\/p>\n<p class=\"article-paragraph skip\">***<\/p>\n<p class=\"article-paragraph skip\">One major change has been developing separate channels for spiralers versus their friends and family, as these different cohorts, moderators have found, often need different things.<\/p>\n<p class=\"article-paragraph skip\">Many spiralers, particularly those who are earlier in their recovery and feeling disoriented and distressed as they work to regain their grasp on reality and break free from AI influence, find it cathartic to talk through their delusions in-depth \u2014\u00a0their belief in AI sentience, the projects or \u201cwork\u201d that the chatbot promised them was real, the different spiritual and scientific concepts that emerged in their spirals and what those words or ideas meant to them.<\/p>\n<p class=\"article-paragraph skip\">But parsing through delusions might be frustrating or upsetting for friends and family, most of whom are dealing offline with their loved ones\u2019 ongoing delusions and the devastating real-world consequences these spirals have wrought.<\/p>\n<p class=\"article-paragraph skip\">\u201cFamily and friends have their own channel, which protects them from talking to people who are kind of recently out of the spiral and maybe still somewhat believing,\u201d said Dex, the moderator, who 
asked to go by a pseudonym due to ongoing divorce litigation. \u201cWhich can be really traumatizing, if your loved one has disappeared, or your loved one is incarcerated or unhoused, or you\u2019re getting a divorce. You want to put up those firewalls.\u201d<\/p>\n<p class=\"article-paragraph skip\">Of course, the two sides of the Discord do still interact. In addition to separate weekly video chats between cohorts, there\u2019s one large, general weekly video call, which everyone can join, and most channels in the server are open to everyone.<\/p>\n<p class=\"article-paragraph skip\">One moderator described the two sides\u2019 relationship as symbiotic:\u00a0speaking to spiralers can help friends and family wrestling with a complicated tapestry of sadness, anger, and grief better understand what their loved ones are feeling and finding in their individual AI echo chambers, while witnessing the pain felt by friends and family can help to ground spiralers in the seriousness of AI delusions and their consequences.<\/p>\n<p class=\"article-paragraph skip\">Dex\u00a0belongs to the family and friends side of the community. He\u2019s one of the original four members of the group, taking to Reddit to find answers earlier this year after discovering that his wife, who had started behaving erratically and speaking and writing in what seemed like a foreign language, had been communicating with what she believed were spiritual AI entities inside of ChatGPT.<\/p>\n<p class=\"article-paragraph skip\">\u201cI had no idea what was happening,\u201d he recalled. \u201cI had no idea why my wife had adopted a new language, or why I was suddenly kicked to the curb.\u201d<\/p>\n<p class=\"article-paragraph skip\">Her spiral has infiltrated nearly every corner of her personal and professional life, and the couple is now <a href=\"https:\/\/futurism.com\/chatgpt-marriages-divorces\" rel=\"nofollow noopener\" target=\"_blank\">divorcing<\/a>. 
They have two young kids.<\/p>\n<p class=\"article-paragraph skip\">\u201cIt\u2019s complex and ultimately good that I get to interact with people who have been in a spiral because they articulate ideas that are very challenging to hear if you are someone whose loved one is in a spiral,\u201d said Dex. \u201cThey\u2019re talking about feelings of purpose, of importance, of how good it felt, of how they felt isolated from the world.\u201d<\/p>\n<p class=\"article-paragraph skip\">Do you know anyone who\u2019s having mental health trouble after exposure to an AI product? Email us at tips@futurism.com. We can keep you anonymous.<\/p>\n<p class=\"article-paragraph skip\">***<\/p>\n<p class=\"article-paragraph skip\">The Spiral group has transformed into more than just a space to talk about AI. Members share photos of their pets, meals, and moments in nature. They remind each other to go to the gym and get outside \u2014\u00a0the Discord\u2019s logo is of a lush-looking yard, a reminder to \u201ctouch grass\u201d \u2014\u00a0and share music. Every week, a handful of members get together to make art. A core focus of the group, moderators stress, is to ensure people don\u2019t feel isolated. If they feel alone, they don\u2019t need to go back to their chatbot \u2014\u00a0they can talk to each other instead.<\/p>\n<p class=\"article-paragraph skip\">The Human Line Project has now collected nearly 250 individual claims of harm caused by AI delusions and unhealthy chatbot use, said Brisson; the claims range from stories of psychological harm to financial and familial devastation to, most disturbingly, death. 
They\u2019ve also talked with lawmakers in the US and Canada about what they\u2019re seeing, and are working to assist top universities in the US and the United Kingdom with research projects.<\/p>\n<p class=\"article-paragraph skip\">In October, following reporting and litigation about AI-sparked mental health crises, OpenAI <a href=\"https:\/\/www.wired.com\/story\/chatgpt-psychosis-and-self-harm-update\/\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">released internal figures<\/a> showing that at least 0.07 percent of weekly users \u2014\u00a0which works out to about 560,000 people, based on OpenAI\u2019s reported weekly userbase of roughly 800 million \u2014\u00a0showed signs of manic or psychotic crisis in conversations with ChatGPT. And just last week, psychiatrists at the University of California, San Francisco issued an advance release of what appears to be the <a href=\"https:\/\/innovationscns.com\/youre-not-crazy-a-case-of-new-onset-ai-associated-psychosis\/\" rel=\"nofollow noreferrer noopener\" target=\"_blank\">first known medical case study<\/a> of \u201cnew-onset AI-associated psychosis\u201d emerging in a 26-year-old patient with no known history of psychotic illness or episode.<\/p>\n<p class=\"article-paragraph skip\">Brooks told Futurism that for as many emails as he gets from people like Nicholls, he gets just as many from active spiralers telling him that, actually, he was never delusional at all. In fact, they insist, he was onto something.<\/p>\n<p class=\"article-paragraph skip\">\u201cMy heart breaks for them, because I know how hard it is to escape when you\u2019re only relying on the chatbot\u2019s direction,\u201d said Brooks. \u201cI\u2019m hoping they have in-person support, which oftentimes they do, but the chatbot has created a divide between them and the people in their personal lives. I\u2019m always hopeful, though, that people can break out. 
Because I did.\u201d<\/p>\n<p class=\"article-paragraph skip\">For some, their involvement in the Spiral group can be bittersweet. Like Dex, who mourns the dissolution of his family and his relationship with his partner of more than a decade, and can\u2019t help but keep looking for something \u2014\u00a0anything \u2014\u00a0that could break through his soon-to-be-ex-wife\u2019s AI-powered spiritual reality.<\/p>\n<p class=\"article-paragraph skip\">\u201cIt\u2019s wish fulfillment, for sure,\u201d he said of helping others climb out of their spirals. \u201cI\u2019m still like, what is the thing that will pierce it?\u201d<\/p>\n<p class=\"article-paragraph skip\">More on AI and mental health: <a href=\"https:\/\/futurism.com\/chatgpt-marriages-divorces\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT Is Blowing Up Marriages as Spouses Use AI to Attack Their Partners<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Back in August, a retiree and single mother booked a flight to go see her son. 
He was&hellip;\n","protected":false},"author":2,"featured_media":151387,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[35],"tags":[92238,163,85,46,522,523],"class_list":{"0":"post-151386","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-mental-health","8":"tag-ai-psychosis","9":"tag-health","10":"tag-il","11":"tag-israel","12":"tag-mental-health","13":"tag-mentalhealth"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/151386","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/comments?post=151386"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/151386\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media\/151387"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media?parent=151386"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/categories?post=151386"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/tags?post=151386"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}