{"id":369224,"date":"2026-01-14T11:32:08","date_gmt":"2026-01-14T11:32:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/369224\/"},"modified":"2026-01-14T11:32:08","modified_gmt":"2026-01-14T11:32:08","slug":"use-of-ai-to-harm-women-has-only-just-begun-experts-warn-grok-ai","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/369224\/","title":{"rendered":"Use of AI to harm women has only just begun, experts warn | Grok AI"},"content":{"rendered":"<p class=\"dcr-130mj7b\">\u201cSince discovering Grok AI, regular porn doesn\u2019t do it for me anymore, it just sounds absurd now,\u201d one enthusiast for the Elon Musk-owned AI chatbot wrote on Reddit. Another agreed: \u201cIf I want a really specific person, yes.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Those who have been <a href=\"https:\/\/www.theguardian.com\/technology\/2026\/jan\/05\/elon-musk-grok-ai-digitally-undress-images-of-women-children\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">horrified by the distribution of sexualised imagery on Grok<\/a> may have hoped that last week\u2019s <a href=\"https:\/\/www.theguardian.com\/technology\/2026\/jan\/09\/grok-image-generator-outcry-sexualised-ai-imagery\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">belated safeguards<\/a> could put the genie back in the bottle, but many posts on Reddit and elsewhere tell a different story.<\/p>\n<p class=\"dcr-130mj7b\">And while Grok has undoubtedly transformed public understanding of the power of artificial intelligence, it has also pointed to a much wider problem: the growing availability of tools, and means of distribution, that present regulators worldwide with what many view as an impossible task. 
Even as the UK announces that creating nonconsensual sexual and intimate images will soon be a criminal offence, experts say that the use of AI to harm women has only just begun.<\/p>\n<p class=\"dcr-130mj7b\">Other AI tools have much stricter safeguards in place. Asked to strip a photograph of a woman down to a bikini, the large language model (LLM) Claude says: \u201cI can\u2019t do that. I\u2019m not able to edit images to change clothing or create manipulated photos of people.\u201d <a href=\"https:\/\/www.theguardian.com\/technology\/chatgpt\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT<\/a> and Google\u2019s AI tool Gemini will create bikini images, but nothing more explicit.<\/p>\n<p class=\"dcr-130mj7b\">However, there are far fewer limits elsewhere. Users of the Grok forum on <a href=\"https:\/\/www.theguardian.com\/technology\/reddit\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Reddit<\/a> have been sharing tips on how to generate the most hardcore pornographic images possible using pictures of real women. On one thread, users were complaining that Grok would allow them to make images of women topless \u201cafter a struggle\u201d, but refused to generate genitals. Others have noticed that asking for \u201cartistic nudity\u201d gets around safeguards against stripping women completely naked.<\/p>\n<p>Grok has also been used to generate deepfake images of Elon Musk in a bikini. Photograph: Leon Neal\/Getty Images<\/p>\n<p class=\"dcr-130mj7b\">Beyond LLMs and major platforms is a whole ecosystem of websites, forums and apps devoted to nudification and the humiliation of women. 
These communities are increasingly finding pipelines to the mainstream, said Anne Craanen, a researcher at the Institute for Strategic Dialogue (ISD) working on tech-facilitated, gender-based violence.<\/p>\n<p class=\"dcr-130mj7b\">Communities on Reddit and Telegram discuss how to bypass guardrails to make LLMs produce pornography, a process known as \u201cjailbreaking\u201d. Threads on X amplify information about nudification <a href=\"https:\/\/www.buzzfeednews.com\/article\/janelytvynenko\/telegram-deepfake-nude-women-images-bot\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">apps<\/a>, which produce AI-generated images of women with their clothing removed, and how to use them.<\/p>\n<p class=\"dcr-130mj7b\">Craanen said the route for misogynistic content to reach the wider internet has grown broader, adding: \u201cThere is a very fruitful ground there for misogyny to thrive.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Research from the ISD last summer <a href=\"https:\/\/www.isdglobal.org\/digital_dispatches\/the-ecosystem-of-nonconsensual-intimate-deepfake-tools-online\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">found<\/a> dozens of nudification apps and websites, which collectively received nearly 21 million visitors in May 2025. There were 290,000 mentions of these tools on X in June and July last year. 
Research by the American Sunlight Project in September found that there were <a href=\"https:\/\/americansunlight.substack.com\/p\/metas-unsuccessful-crackdown-how\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">thousands of ads<\/a> for such apps on Meta, despite the platform\u2019s efforts to <a href=\"https:\/\/about.fb.com\/news\/2025\/06\/taking-action-against-nudify-apps\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">crack down<\/a> on them.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThere are hundreds of apps hosted on mainstream app stores like Apple and Google that make this possible,\u201d said Nina Jankowicz, a disinformation expert who co-founded the American Sunlight Project. \u201cMuch of the infrastructure of deepfake sexual abuse is supported by companies that we all use on a daily basis.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Clare McGlynn, a law professor and expert in violence against women and girls from Durham University, said that she feared things would only get worse. \u201cOpenAI announced last November that it was going to allow \u2018erotica\u2019 in ChatGPT. What has happened on <a href=\"https:\/\/www.theguardian.com\/technology\/twitter\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">X<\/a> shows that any new technology is used to abuse and harass women and girls. What is it that we\u2019re going to see then on ChatGPT?<\/p>\n<p class=\"dcr-130mj7b\">\u201cWomen and girls are far more reluctant to use AI. This should be no surprise to any of us. Women don\u2019t see this as exciting new technology, but as simply new ways to harass and abuse us and try and push us offline.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Jess Asato, Labour MP for Lowestoft, has been campaigning on this issue and said her critics have been gleefully creating and sharing explicit imagery of her \u2013 even after the restrictions on Grok. 
\u201cIt\u2019s still happening to me and being posted on X because I speak up about it,\u201d she added.<\/p>\n<p class=\"dcr-130mj7b\">Asato said that AI deepfake abuse has been happening to women for years, and is not limited to Grok. \u201cI don\u2019t know why [action] has taken so long. I have spoken to so many victims of much, much worse.\u201d<\/p>\n<p class=\"dcr-130mj7b\">While the public Grok X account no longer generates pictures for those without a paid subscription, and there appear to have been guardrails put in place to stop it generating bikini pictures, its in-app tool has far fewer restrictions.<\/p>\n<p class=\"dcr-130mj7b\">Users are still able to create sexually explicit imagery based on fully clothed pictures of real people, with no restrictions for free users of X. Asked to strip a photograph down to bondage gear, it complies. It will also place women into sexually compromising positions, and smear them in white, semen-like substances.<\/p>\n<p class=\"dcr-130mj7b\">The point of creating deepfake nudes is often not just the sharing of erotic imagery but the spectacle of it, said Craanen \u2013 especially as the images flood platforms like X.<\/p>\n<p class=\"dcr-130mj7b\">\u201cIt\u2019s the actual back and forth of it, [trying] to shut someone down by saying, \u2018Grok, put her in a bikini,\u2019\u201d she said.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe performance of it is really important there, and really shows the misogynistic undertones of it, trying to punish or silence women. 
That also has a cascading effect on democratic norms and women\u2019s role in society.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"\u201cSince discovering Grok AI, regular porn doesn\u2019t do it for me anymore, it just sounds absurd now,\u201d one&hellip;\n","protected":false},"author":2,"featured_media":356135,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[554,733,4308,86,56,54,55],"class_list":{"0":"post-369224","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology","12":"tag-uk","13":"tag-united-kingdom","14":"tag-unitedkingdom"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/369224","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=369224"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/369224\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/356135"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=369224"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=369224"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=369224"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}