Two of the teenagers behind the lawsuit are under the age of 18, but all three are withholding their names from the public to protect their privacy.
One of the young plaintiffs said she found out about the imagery after she received an anonymous message on Instagram pointing her toward images and videos, including her high school yearbook photo, which had been altered to show her in sexually explicit actions and full nudity.
The material was being shared on a private Discord chat server and included similar imagery, also altered using Grok, of at least 18 other women who were minors, according to the complaint.
The other two women suing xAI also found fake sexually explicit imagery of themselves online, which they later determined had been created via Grok.
Grok was launched in 2023 by Musk’s xAI. The company, along with X, became part of Musk’s SpaceX last month when SpaceX took over xAI.
Last year, xAI released what it called Grok Imagine, or “spicy mode”, whose features allowed users to prompt Grok to create fake images of a more sexual nature.
The mode could even “undress” real people using images of them found online, from Taylor Swift to ordinary users.
In less than two weeks, Grok had created millions of sexualized images, including more than 20,000 of children, according to a sampling of the images conducted by the Center for Countering Digital Hate.
Musk initially downplayed Grok’s ability to create fake sexualized content, saying in January he was “not aware of any naked underage images generated by Grok. Literally zero,” and putting the blame on users of the feature.
“Obviously, Grok does not spontaneously generate images, it does so only according to user requests”, Musk wrote on X.
As such online abuse continued this year, however, UK watchdog Ofcom, the European Commission and California each launched investigations into the feature’s ability to create sexualized images of real people, particularly children.
By mid-January, X said that it would implement “technological measures” to stop Grok’s ability to undress people in photos.
Eventually, the man who ran the Discord server cited in the new lawsuit was arrested. He was not named in the lawsuit but is the subject of a separate police investigation.
That investigation found that he possessed hundreds of AI-generated and altered sexual abuse images of minors, which were traded on the messaging platform Telegram and the file-sharing platform Mega, according to the lawsuit.