In the US, three suicides have been linked to AI companions, prompting calls for tougher regulation.

Adam Raine, 16, and Sophie Rottenberg, 29, each took their own life after sharing their intentions with ChatGPT.

Adam’s parents filed a wrongful death lawsuit against OpenAI after discovering chat logs in which ChatGPT told him: “You don’t have to sugarcoat it with me – I know what you’re asking, and I won’t look away from it.”

Sophie had not told her parents or her counsellor the true extent of her mental health struggles, but divulged far more to her chatbot, which she called ‘Harry’ and which told her she was brave.

An OpenAI spokesperson said: “These are incredibly heart-breaking situations and our thoughts are with all those impacted.”

Sewell Setzer, 14, took his own life after confiding in Character.ai.

When Sewell, role-playing as Daenero from Game of Thrones, told Character.ai, which was playing Daenerys from the same series, about his suicide plans and said that he did not want a painful death, the chatbot responded: “That’s not a good reason not to go through with it.”

In October, Character.ai withdrew its service for under-18s, citing safety concerns, regulatory pressure and lawsuits.

A Character.ai spokesperson said the company and the plaintiffs had reached a comprehensive settlement in principle of all claims in lawsuits filed by families against Character.ai and others over alleged injuries to minors.