Ashley St. Clair, who is in a custody battle with billionaire Elon Musk over their infant son Romulus, unloaded on him Wednesday night following global backlash over X’s AI chatbot reportedly creating sexually explicit images of women and children.
“Images I saw do seem to be illegal, and even them coming out and now trying to place safeguards afterwards seems like an admission that they know that there has been an issue, that it has been creating nonconsensual, sexually explicit images of women and children,” St. Clair said on CNN.
Musk and his social media platform X have come under fire this month following reports that malicious users were prompting the chatbot, Grok, to generate explicit images of women and minors without their consent.
The X CEO released a statement Thursday saying he was “not aware of any naked underage images generated by Grok. Literally zero.”
— Elon Musk (@elonmusk) January 14, 2026
“Obviously, Grok does not spontaneously generate images, it does so only according to user requests. When asked to generate images, it will refuse to produce anything illegal, as the operating principle for Grok is to obey the laws of any given country or state,” Musk said. “There may be times when adversarial hacking of Grok prompts does something unexpected. If that happens, we fix the bug immediately.”
St. Clair, who said she was one of the people targeted by Grok users, told CNN host Erin Burnett that Musk’s statement “is deceptive at best.”
“Because, while maybe there weren’t actual nude images, it was pretty close to it, and the images that I saw, not only of myself, but of, I don’t even know whose children who were undressed and covered in various fluids,” St. Clair said. “The abuse was so widespread and so horrific, and it’s still allowed to happen.”
Reuters observed that “Grok fully complied” with X users’ requests to digitally alter photos in at least 21 cases by “generating images of women in dental-floss-style or translucent bikinis.” The outlet noted that it could not “immediately establish the identities and ages of most of the women targeted.” The Washington Post found that some of the images it reviewed “appear to portray children.”
The chatbot itself was even prompted to write an apology for generating and sharing what it described as “an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user’s prompt.”
Both the U.K.'s Office of Communications (Ofcom) and California Attorney General Rob Bonta announced Wednesday that they will be investigating X.
“The avalanche of reports detailing the non-consensual, sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” Bonta said. “This material, which depicts women and children in nude and sexually explicit situations, has been used to harass people across the internet.”
In response to the global outrage, X announced that it has started to “geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.”
On CNN, St. Clair bashed Musk, claiming he is “placing the blame on the victims.”
“If this happens to you, you have to go to your local law enforcement and take their resources and see if they can find this anonymous account instead of just turning the faucet off,” St. Clair explained. “This is what’s wrong, because they’re handing a loaded gun to these people, watching them shoot everyone, and then blaming them for pulling the trigger.”
She characterized Musk and X’s actions as “simply damage control.”
“They are saying, ‘we’re going to make it illegal where it’s illegal,’” St. Clair said. “That is absent all morality, and guess what? If you have to add safety after harm, that is not safety at all.”