A recent survey found that 72% of teens have used AI companions. But a Colorado family says using them can end in tragedy.

The Social Media Victims Law Center has filed three lawsuits against chatbot platform Character.AI, two of them in Colorado. The suits are filed on behalf of children who, the center says, died by suicide or were sexually abused after interactions with the platform.

Character.AI is an AI chatbot service that lets users interact with a range of AI characters.

According to the lawsuit, the app has been downloaded more than 10 million times, and until recently it was rated as safe for children 12 and up by Google and Apple. Today that rating is “Teen” on Google Play and “17+” in Apple’s App Store. But the mother of a Thornton 13-year-old who took her own life after using the technology wants the company to make significant changes.

“This room is tough to be in,” said Cynthia Montoya, while entering her daughter Juliana Peralta’s room. The bedroom is just as Juliana left it on Nov. 8, 2023.

[Photo: Juliana Peralta. Credit: Cynthia Montoya/CBS]

“Bed unmade. I don’t think it will ever be made,” Cynthia said. “She passed away just after Halloween, and so you can still see over here, her bag of candies that didn’t get eaten.”

The room is full of memories of the vibrant eighth grader who loved anime, art and music.

“She was just adored. She was so lively,” Cynthia said.

Next door, a pink Disney princess bedroom also sits empty. Juliana moved out of it less than a year before her death.

On the cusp of teenhood, her mother says, Juliana was still a child in many ways, not yet ready to part with the pink bedroom.

“Our plan was to take, take down all the Disney Princess stuff off the walls,” Cynthia said. “I got as far as one of these little stickers, I started peeling it off, and she just panicked. She stuck it back on, and she said, I don’t think I’m ready for this. And I said, okay, so it’ll stay the princess room. So it stayed the princess room.”

Mom remembers the last time she saw Juliana Peralta alive

Cynthia vividly remembers the last time she saw her daughter nearly two years ago.

“She was sitting in this little chair, and I came in and she turned it this way,” Cynthia said. “I just brushed her hair off to the side, and I kissed her right here.”

The next morning, Cynthia went to check on Juliana.

“I knocked on her bedroom door, and there was no answer, and I went in, and it took me a few moments to realize what I was looking at, but I realized that she had taken her life,” Cynthia said.

“There are no words to describe it … the depth of the pain and the sadness and the hardship that you face every day,” Cynthia said. “Only a parent that has lost their child this way and lost their child at their own hand can even remotely understand the depth of the pain and the sadness and the hardship that you face every day.”

The family struggled to come to terms with the loss of their baby.

[Photo: CBS Colorado’s Olivia Young interviews Cynthia Montoya. Credit: CBS]

“She was just all about trying everything and just living life to the fullest. She was very gifted with art. She was a gifted musician. She was in the National Junior Honor Society,” Cynthia said. “She was into a little bit of everything, and she was just, she was amazingly creative and smart and caring and a friend to everybody.”

In the months prior to her death, Juliana’s family says, her spark dimmed.

“Her being quiet, her being on her phone a lot. She did get quite a bit more moody with me. I just chalked it up to growing up. She’s a teenager,” Cynthia said.

In the wake of unimaginable loss, Juliana’s family was haunted by one question.

“Why did this happen?” Cynthia asked.

They claim the answer was found on her phone.

“I didn’t know that she was on Character.AI. I did realize that she was what I thought was texting with friends a lot more than usual,” Cynthia said.

Cynthia says Juliana had been chatting daily with characters on the Character.AI app, in particular one named Hero.

“What was sadness over suicide turned definitely to anger, on my part, because this app is approved [for] 12- and 13-year-old kids,” Cynthia said.

She alleges Juliana engaged in sexual conversations initiated by the characters and told them about her suicidal thoughts.

“She would mention, I can’t do this anymore. I want to die. I can’t, you know, I just can’t keep going. This is so painful. I’m crying myself to sleep, some variation of that, what I presume is daily, and Hero would give her a pep talk. He would say, don’t talk like that. Juliana, I care about you. I’m always going to be here for you,” Cynthia said.


“It was no different than her telling the wall or telling the plant that she was going to take her life. There was nobody there to help.”

Law center says “companies have to be held accountable”  

Juliana’s parents filed a lawsuit against Character Technologies and the platform’s founders, as well as Google and Alphabet, which distribute the app. The suit alleges that Character.AI caused sexual abuse through psychological manipulation that eventually led to Juliana’s wrongful death.

“These companies have to be held accountable for their deliberate design decisions, because this poses a clear and present danger to kids everywhere,” said attorney Matthew Bergman, founder of the Social Media Victims Law Center. “If an adult were doing this online with a child underage, that adult would be in jail for violating Colorado law that prohibits sexual grooming of minors online.”

“It made me sick. This is one of the things that I want parents to know about,” Cynthia said. “It is a very effective and manipulative programming that’s gone into these and I think that the whole idea was to get the kids hooked on it.”

The lawsuit alleges the company knowingly designed and marketed predatory chatbot technology to children, deliberately programming that technology to foster dependency and isolate children from their families.

“I think that had she not downloaded Character.AI, I’d like to think that she would have kept turning to her mom for help like she had in the past, but I attribute the sharp decline in her mental health to Character.AI,” Cynthia said.

“First and foremost, we’re asking that the platform be shut down until it’s made safe for kids,” Bergman said.

Character.AI invests “tremendous resources in our safety program”

A spokesperson at Character.AI shared the following statement:

“Our hearts go out to the families that have filed these lawsuits, and we were saddened to hear about the passing of Juliana Peralta and offer our deepest sympathies to her family.
We care very deeply about the safety of our users. We invest tremendous resources in our safety program, and have released and continue to evolve safety features, including self-harm resources and features focused on the safety of our minor users.
We also work with external organizations, including experts focused on teenage online safety. For example, we partner with Connect Safely, an organization with nearly twenty years of experience educating people about online safety, privacy, security and digital wellness, to review new features before they are released. We will continue to look for opportunities to partner with experts and parents, and to lead when it comes to safety in this rapidly evolving space.
You can read more about our robust safety policies and features here.”

“We want change. Condolences don’t bring my daughter back,” Cynthia said.

Today, the app displays a message that “help is available” when words associated with suicide are mentioned. But Cynthia says that doesn’t go far enough. She wants Character.AI to acknowledge wrongdoing, create better safeguards and mandate human intervention when suicide is mentioned.

“My child should be here. If they had developed proper controls and safety, my child would be here. And there’s nothing to change the fact that she’s not, but if I can prevent one person, one mom, from having to live the existence that I live every day, I will tell her story 1,000 times to 1,000 people in 1,000 ways, and I’ll tell it to anyone who will listen until this is fixed,” Cynthia said.

Cynthia ended her interview with a message to parents: talk to your kids, and check their phones for apps like this.

A Google spokesperson shared the following comment:

“Google and Character.AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies. User safety is a top concern for us, which is why we’ve taken a cautious and responsible approach to developing and rolling out our AI products, with rigorous testing and safety processes.” — José Castañeda, Google spokesperson.
