X has come under intense scrutiny from users, politicians and regulators worldwide over Grok being used to make non-consensual sexualised imagery of people.

Users had been able to tag the Grok account in posts or replies to posts on the platform and ask it to edit images to undress people.

Grok complied with many such requests, producing photo-realistic images of real women in bikinis and revealing clothing, with reports that it also produced sexualised images of children.

On Wednesday, before her court filing, Ms St Clair told BBC Newsnight her image had been “stripped” to appear “basically nude, bent over” despite her telling Grok she did not consent to the sexualised images.

She and other women whose images were edited using Grok said the site was not doing enough to tackle illegal content, including child sexual abuse imagery.

Following a backlash, X changed its rules so that only paying users could use the function, a move that drew criticism from women's groups and the UK government.

The company said on Wednesday that X users would no longer be able to edit photos of real people to show them in revealing clothing in jurisdictions where it was illegal.

It later updated its post to say it would implement “similar geoblocking measures for the Grok app”, which is separate from X.

On Friday, The Guardian reported that it was still possible to use the standalone Grok app to generate sexualised deepfakes of real people and post them on X “without any sign of it being moderated”.

The UK government is bringing into force a law that will make it illegal to create non-consensual intimate images, and the regulator Ofcom is still investigating whether X broke existing UK laws.