{"id":460825,"date":"2026-02-08T01:49:12","date_gmt":"2026-02-08T01:49:12","guid":{"rendered":"https:\/\/www.newsbeep.com\/ca\/460825\/"},"modified":"2026-02-08T01:49:12","modified_gmt":"2026-02-08T01:49:12","slug":"victims-urge-tougher-action-on-deepfake-abuse-as-new-law-comes-into-force-deepfake","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ca\/460825\/","title":{"rendered":"Victims urge tougher action on deepfake abuse as new law comes into force | Deepfake"},"content":{"rendered":"<p class=\"dcr-130mj7b\">Victims of deepfake image abuse have called for stronger protection against AI-generated explicit images, as the law criminalising the creation of non-consensual intimate images comes into effect.<\/p>\n<p class=\"dcr-130mj7b\">Campaigners from Stop Image-Based Abuse delivered a <a href=\"https:\/\/www.change.org\/p\/deepfake-sexual-abuse-is-not-porn-demand-action-to-stop-image-based-abuse?utm_source=National-media&amp;utm_campaign=stop-deepfake-sexual-abuse&amp;utm_content=Womans-Rights&amp;utm_term=0-99&amp;utm_medium=National-petition\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">petition<\/a> to Downing Street with more than 73,000 signatures, urging the government to introduce civil routes to justice such as takedown orders for abusive imagery on platforms and devices.<\/p>\n<p class=\"dcr-130mj7b\">\u201cToday\u2019s a really momentous day,\u201d said Jodie, a victim of deepfake abuse who uses a pseudonym.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWe\u2019re really pleased the government has put these amendments into law that will definitely protect more women and girls. 
They were hard-fought victories by campaigners, particularly the consent-based element of it,\u201d she added.<\/p>\n<p class=\"dcr-130mj7b\">In the petition, campaigners are also calling for improved relationships and sex education, as well as adequate funding for specialist services, such as the Revenge Porn Helpline, which support intimate image abuse victims.<\/p>\n<p class=\"dcr-130mj7b\">Jodie, who is in her 20s, discovered images of her being used as deepfake pornography in 2021. She and 15 other women testified against the perpetrator, 26-year-old Alex Woolf, after he posted images of women from social media to porn websites. He was convicted and sentenced to 20 weeks in prison.<\/p>\n<p class=\"dcr-130mj7b\">\u201cI had a really difficult route to getting justice because there simply wasn\u2019t a law that really covered what I felt had been done to me,\u201d said Jodie.<\/p>\n<p class=\"dcr-130mj7b\">The offence of creating explicit deepfake images was introduced as an amendment to the Data (Use and Access) Act 2025. While the law received royal assent last July, the offence did not come into force until Friday.<\/p>\n<p class=\"dcr-130mj7b\">Many campaigners, including Jodie, were frustrated by delays to the law coming into effect. \u201cWe had these amendments ready to go with royal assent before Christmas,\u201d said Jodie. \u201cThey should have brought them in immediately. 
The delay has caused millions more women to become victims, and they won\u2019t be able to get the justice they desperately want.\u201d<\/p>\n<p class=\"dcr-130mj7b\">In January, Leicestershire police opened an investigation into a case involving sexually explicit deepfake images created by Grok AI.<\/p>\n<p class=\"dcr-130mj7b\">Madelaine Thomas, a sex worker and founder of tech forensics company <a href=\"https:\/\/imageangel.co.uk\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Image Angel<\/a>, who has waived her right to anonymity, said it was \u201ca very emotional day\u201d for her and other victims. However, she said the law falls short of protecting sex workers from intimate image abuse.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWhen commercial sexual images are misused, they\u2019re only seen as a copyright breach. I respect that,\u201d Thomas said. \u201cHowever, the proportion of available responses doesn\u2019t match the harm that occurs when you experience it. By discounting commercialised intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need.\u201d<\/p>\n<p class=\"dcr-130mj7b\">For the last seven years, intimate images of Thomas have been shared without her consent almost every day. 
\u201cWhen I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that.\u201d<\/p>\n<p class=\"dcr-130mj7b\">One in three women in the UK have experienced online abuse, according to domestic abuse organisation <a href=\"https:\/\/refuge.org.uk\/wp-content\/uploads\/2022\/04\/unsocial-spaces-.pdf\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Refuge<\/a>.<\/p>\n<p class=\"dcr-130mj7b\">Stop Image-Based Abuse is a movement composed of the End Violence Against <a href=\"https:\/\/www.theguardian.com\/society\/women\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" rel=\"nofollow noopener\" target=\"_blank\">Women<\/a> Coalition, the victim campaign group #NotYourPorn, Glamour UK and Clare McGlynn, a professor of law at Durham University.<\/p>\n<p class=\"dcr-130mj7b\">A Ministry of Justice spokesperson said: \u201cWeaponising technology to target and exploit people is completely abhorrent. It\u2019s already illegal to share intimate deepfakes \u2013 and as of yesterday, creating them is a criminal offence too.<\/p>\n<p class=\"dcr-130mj7b\">\u201cBut we\u2019re not stopping there. 
We\u2019re going after the companies behind these \u2018nudification\u2019 apps, banning them outright so we can stop this abuse at source.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThe technology secretary has also confirmed that creating non-consensual sexual deepfakes will be made a priority offence under the Online Safety Act, placing extra duties on platforms to proactively prevent this content from appearing.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Victims of deepfake image abuse have called for stronger protection against AI-generated explicit images, as the law criminalising&hellip;\n","protected":false},"author":2,"featured_media":460826,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[62,276,277,49,48,61],"class_list":{"0":"post-460825","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-ca","12":"tag-canada","13":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/460825","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/comments?post=460825"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/posts\/460825\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media\/460826"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/media?parent=460825"}],"wp:term":[{"taxonomy":"category","embeddable":true,
"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/categories?post=460825"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ca\/wp-json\/wp\/v2\/tags?post=460825"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}