From time to time, we all have questions that boil down to: Is this normal? Did I do the right thing? Am I okay? About two years ago, Kate — who asked to be identified by only her first name to protect her privacy — started typing these kinds of questions into ChatGPT.

“Nobody has a guide for being human that shows you a manual of all the ways that are normal to act,” she said. “I guess it’s like [I was] looking for that authoritative source that goes, ‘Yes, this was certainly the right way or the wrong way or the abnormal way to act.’” Feeding it a scenario from her life, she’d ask whether she could have misinterpreted something or if she did the right thing. “It doesn’t really answer the question, because nobody can answer the question,” she added.

Even though Kate knew she couldn’t get the certainty she wanted, she would sometimes spend up to 14 hours a day posing these kinds of questions to ChatGPT. “You want it to reaffirm, to add weight,” she said. “If you’re 99% sure, you want it to make that 100, but it can’t because that’s not a thing.”

This urge to ask for assurance again and again can amount to compulsive reassurance-seeking, which is common among people with anxiety disorders and obsessive-compulsive disorder. We all need some affirmation on occasion, but what makes compulsive reassurance-seeking different is that someone will fixate on a sliver of doubt, chasing a certainty that doesn't exist, according to Andrea Kulberg, a licensed psychologist who has been treating anxiety for 25 years.

“People do it because it gives them the illusion of certainty,” Kulberg said. By researching online or asking questions to a chatbot, you’re trying to convince yourself that something bad won’t happen, she explained. And while securing reassurance may offer a temporary bit of relief, it actually gives credence to the need to seek reassurance and can increase anxiety over time. Kulberg added, “The anxiety never says, ‘We’re good, okay, you can stop reassurance seeking,’ because it’s always followed by more doubt.”

There are many avenues people use to compulsively seek reassurance — books, forums, Google, friends and family. But unlike AI chatbots, these other resources don’t prompt their users to keep going, which is one of the features that can make AI chatbots a perfect storm for individuals with OCD and anxiety disorders. “It never gives you one complete response,” Kate said. “It always says, ‘Would you like me to do this?’ And I’m like, well, yeah, sure, if we’re not finished, if it’s not complete.”

“It’s a massive wormhole for me,” said Shannon, who can spend upwards of 10 hours a day asking for reassurance from AI chatbots. (Shannon also asked to use only her first name.) She keeps several chats active, each reserved for a particular topic that her anxiety regularly homes in on. “I’m definitely aware that it’s not healthy to do. I do try to avoid it, but I still find myself getting sucked in,” she said. “I’ll just think of something, and I’ll just feel that urge to go and ask AI about it.”

Occasionally, when Kate questions ChatGPT for hours about a single topic, the chatbot eventually tells her there is nothing else it can say on the matter. “I think most people never get to that point where it goes, ‘I give up,’” she said. But other than these moments — or when her phone battery dies — there are few breaks.