As AI becomes more prevalent in everyday life, students and experts have expressed concern over the growing trend of using it to fulfill social and emotional needs.

The AI industry has experienced massive growth over the past two years, including the introduction of AI companionship models. These systems, such as Friend AI and Talkie AI, aim to give users a chatbot that’s available anytime to offer advice and emotional support.

Many people are beginning to feel the real-world effects of this kind of AI usage. This August, parents filed a lawsuit against OpenAI, the developer behind ChatGPT, alleging the model encouraged their late son’s suicide.

Ran An, a sixth-year developmental psychology Ph.D. candidate whose research focuses on AI, said AI inherently lacks the ability to form genuine connections, making it inadequate as any sort of companion.

“AI can simulate empathy and generate supportive responses, but it doesn’t actually feel or form genuine attachments,” An said. “Emotional connection relies on shared human experiences — things like empathy, vulnerability and mutual understanding — which AI simply cannot replicate.”

An believes it’s especially important for young people to make authentic connections with others and that AI should not be used as a replacement.

“I don’t think AI should fulfill an emotional role in young people’s lives. Children and adolescents need real-human relationships — family, friends, teachers — to develop emotional understanding and empathy,” An said. “It’s important to treat AI as a supplementary tool, not a substitute for connection.”

An discussed the importance of academic institutions teaching responsible AI use, a main focus of her research.

“I think we’re at a stage where institutions and educators need to help students develop AI literacy — understanding how to use it critically and ethically, rather than just efficiently,” An said.

Hooman Rashidi, associate dean of AI in medicine, believes the rise of student AI use is simply due to increased accessibility.

“The delivery platform that is being shown to the audience is a lot more user-friendly than it used to be,” Rashidi said. “This is just the next phase of things we’ve been doing for the past few decades, except they’ve now accelerated dramatically.”

Rashidi said he feels dependence on AI could harm mental health, particularly if the technology is used excessively for personal issues or otherwise improperly.

“These [AIs] have a lot of capabilities and a lot of strengths, but they also have lots of limitations. There’s lots of biases to overcome,” Rashidi said. “There’s dependencies that people may want to minimize because they may seem too real.”

Rashidi also expressed concern about younger people becoming dependent on AI, which he feels could have negative long-term effects.

“As we adopt these tools, the critical thinking element is now being passed on partially to some generative AI tools,” Rashidi said. “What’s going to happen to these different generations that are coming through is they become pathologically dependent on certain AI frameworks.”

Abby Braun, a senior neuroscience major, discussed her personal AI usage, which mostly includes assistance with schoolwork and other academic help.

“I really like AI, personally,” Braun said. “It helps me with my schoolwork, and it’s a good tool. Sometimes, I use it to outline an essay, and I’ve been involved with a lot of research recently, so I’ve been having it help me find better sources.”

Although she frequently uses AI for her academics, Braun said she would never consider using it to fulfill emotional needs.

“I would never use that in my life,” Braun said. “I don’t need to seek comfort in a fake chatbot. I think that’s a little bit weird, and it could lead to a lot of construed emotions.”

Nick Starace, a senior electrical engineering major, also said he uses AI to assist in his schoolwork, specifically for its explanations of more difficult concepts.

“I use AI to help me learn material — maybe sometimes to assist with assignments, but mostly for understanding, since it can articulate questions and subjects that are a little more dense,” Starace said.

Starace said he disagrees with AI being used as a stand-in for therapy, specifically because of its nonhuman nature.

“I think therapy should be something intimate, and something synthetic like AI isn’t helping to cure any problems,” Starace said. “I’m really not a fan of it at all.”

Yanshan Wang, leader of the Pitt Natural Language Processing and Artificial Intelligence Innovation Laboratory, said he feels personal connections between humans and AI have applications in health care, specifically in diagnosing mental health conditions.

“In health care, I think it is a useful tool to detect depression and a lot of mental health disorders,” Wang said. “People can use the [AI’s] response in a responsible way and develop AI algorithms to detect mental health disorders. Those tools are really powerful to improve and to actually address the current mental health crisis.”

Wang singled out older adults as a population that could benefit from AI connections providing a sense of companionship and happiness.

“A lot of old adults are living alone, and they are quite lonely and need some emotional support. If they can talk with some of the AI companions and share their life stories, then it could improve their emotions,” Wang said.