{"id":252699,"date":"2025-10-26T13:01:08","date_gmt":"2025-10-26T13:01:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/252699\/"},"modified":"2025-10-26T13:01:08","modified_gmt":"2025-10-26T13:01:08","slug":"the-next-legal-frontier-is-your-face-and-ai","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/252699\/","title":{"rendered":"The next legal frontier is your face and AI"},"content":{"rendered":"<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">This is <a href=\"https:\/\/www.theverge.com\/the-stepback-newsletter\" rel=\"nofollow noopener\" target=\"_blank\">The Stepback<\/a>, a weekly newsletter breaking down one essential story from the tech world. For more on the legal morass of AI, follow <a href=\"https:\/\/www.theverge.com\/authors\/adi-robertson\" rel=\"nofollow noopener\" target=\"_blank\">Adi Robertson<\/a>. The Stepback arrives in our subscribers\u2019 inboxes at 8AM ET. Opt in for The Stepback <a href=\"https:\/\/www.theverge.com\/newsletters\" rel=\"nofollow noopener\" target=\"_blank\">here<\/a>.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">The song was called \u201cHeart on My Sleeve,\u201d and if you didn\u2019t know better, you might guess you were hearing Drake. If you did know better, you were hearing the starting bell of a new legal and cultural battle: the fight over how AI services should be able to use people\u2019s faces and voices, and how platforms should respond.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Back in 2023, the AI-generated faux-Drake track \u201cHeart on My Sleeve\u201d was a novelty; even so, the problems it presented were clear. The song\u2019s close imitation of a major artist rattled musicians. 
Streaming services removed it on a copyright technicality. But the creator wasn\u2019t making a direct copy of anything \u2014 just a very close imitation. So attention quickly turned to the <a href=\"https:\/\/www.theverge.com\/2023\/9\/21\/23836337\/music-generative-ai-voice-likeness-regulation\" rel=\"nofollow noopener\" target=\"_blank\">separate area of likeness law<\/a>. It\u2019s a field that was once synonymous with celebrities going after unauthorized endorsements and parodies, and as audio and video deepfakes proliferated, it felt like one of the few tools available to regulate them.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Unlike copyright, which is governed by the Digital Millennium Copyright Act and multiple international treaties, there\u2019s no federal law around likeness. It\u2019s a patchwork of varying state laws, none of which were originally designed with AI in mind. But the past few years have seen a flurry of efforts to change that. In 2024, <a href=\"https:\/\/www.nytimes.com\/2024\/03\/21\/us\/politics\/tennessee-ai-music-law.html\" rel=\"nofollow noopener\" target=\"_blank\">Tennessee Gov. Bill Lee<\/a> and <a href=\"https:\/\/www.theverge.com\/2024\/9\/17\/24247583\/california-governor-newsom-signs-ai-digital-replica-bills\" rel=\"nofollow noopener\" target=\"_blank\">California Gov. Gavin Newsom<\/a> \u2014 both of whose states rely heavily on their media industries \u2014 signed bills that expanded protections against unauthorized replicas of entertainers.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">But law has predictably moved more slowly than tech. Last month OpenAI launched Sora, an AI video generation platform aimed specifically at capturing and remixing real people\u2019s likenesses. 
It opened the floodgates to a torrent of often startlingly realistic deepfakes, including of people who didn\u2019t consent to their creation. OpenAI and other companies are responding by implementing their own likeness policies \u2014 which, in the absence of anything else, could turn into the internet\u2019s new rules of the road.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">OpenAI has <a href=\"https:\/\/www.theverge.com\/ai-artificial-intelligence\/795171\/openai-devday-sam-altman-sora-launch-copyright\" rel=\"nofollow noopener\" target=\"_blank\">denied it was reckless<\/a> launching Sora, with CEO Sam Altman claiming that if anything, it was \u201cway too restrictive\u201d with guardrails. Yet the service has still generated plenty of complaints. It launched with minimal restrictions on the likenesses of historical figures, only to <a href=\"https:\/\/x.com\/OpenAINewsroom\/status\/1979005850166648933\" rel=\"nofollow\">reverse course<\/a> after Martin Luther King Jr.\u2019s estate complained about \u201cdisrespectful depictions\u201d of the assassinated civil rights leader <a href=\"https:\/\/www.npr.org\/2025\/10\/17\/nx-s1-5577869\/sora-block-videos-mlk\" rel=\"nofollow noopener\" target=\"_blank\">spewing racism or committing crimes<\/a>. 
It touted careful restrictions on unauthorized use of living people\u2019s likenesses, but users found ways around them to put celebrities like Bryan Cranston into Sora videos doing things like <a href=\"https:\/\/www.latimes.com\/entertainment-arts\/business\/story\/2025-10-11\/hollywood-ai-battle-heats-up-sora2-openai-sam-altman\" rel=\"nofollow noopener\" target=\"_blank\">taking a selfie<\/a> with Michael Jackson, leading to <a href=\"https:\/\/www.theverge.com\/news\/803141\/openai-sora-bryan-cranston-deepfakes\" rel=\"nofollow noopener\" target=\"_blank\">complaints from SAG-AFTRA<\/a> that pushed OpenAI to strengthen guardrails in unspecified ways there too.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Even some people who did authorize Sora cameos (its word for a video using a person\u2019s likeness) were unsettled by the results, including, for women, <a href=\"https:\/\/www.businessinsider.com\/sora-video-openai-fetish-content-my-face-problem-2025-10\" rel=\"nofollow noopener\" target=\"_blank\">all kinds of fetish output<\/a>. Altman said he hadn\u2019t realized people might have <a href=\"https:\/\/www.theverge.com\/ai-artificial-intelligence\/795171\/openai-devday-sam-altman-sora-launch-copyright\" rel=\"nofollow noopener\" target=\"_blank\">\u201cin-between\u201d feelings<\/a> about authorized likenesses, like not wanting a public cameo \u201cto say offensive things or things that they find deeply problematic.\u201d<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Sora\u2019s been addressing problems with changes like its tweak to the historical figures policy, but it\u2019s not the only AI video service, and things are getting \u2014 in general \u2014 very weird. 
AI slop has become de rigueur for President Donald Trump\u2019s administration and some other politicians, including gross or outright racist depictions of specific political enemies: Trump responded to last week\u2019s No Kings protests with a video that showed him dropping shit on a person who <a href=\"https:\/\/thehill.com\/homenews\/administration\/5567814-harry-sisson-trump-ai-video\/\" rel=\"nofollow noopener\" target=\"_blank\">resembled liberal influencer Harry Sisson<\/a>, while New York City mayoral candidate <a href=\"https:\/\/www.theguardian.com\/us-news\/2025\/oct\/23\/cuomo-zohran-mamdani-ai-ad\" rel=\"nofollow noopener\" target=\"_blank\">Andrew Cuomo posted<\/a> (and quickly deleted) a \u201ccriminals for Zohran Mamdani\u201d video that showed his Democratic opponent gobbling handfuls of rice. As <a href=\"https:\/\/spitfirenews.com\/p\/ai-generated-influencer-scandals-are-here\" rel=\"nofollow noopener\" target=\"_blank\">Kat Tenbarge chronicled in Spitfire News<\/a> earlier this month, AI videos are becoming ammunition in influencer drama as well.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">There\u2019s an almost constant potential threat of legal action around unauthorized videos, as celebrities like <a href=\"https:\/\/www.theverge.com\/2024\/5\/22\/24162429\/scarlett-johansson-openai-legal-right-to-publicity-likeness-midler-lawyers\" rel=\"nofollow noopener\" target=\"_blank\">Scarlett Johansson have lawyered up<\/a> over use of their likeness. 
But unlike with AI copyright infringement allegations, which have generated <a href=\"https:\/\/www.theverge.com\/analysis\/694657\/ai-copyright-rulings-anthropic-meta\" rel=\"nofollow noopener\" target=\"_blank\">numerous high-profile lawsuits<\/a> and <a href=\"https:\/\/www.theverge.com\/news\/602096\/copyright-office-says-ai-prompting-doesnt-deserve-copyright-protection\" rel=\"nofollow noopener\" target=\"_blank\">nearly constant deliberation<\/a> inside regulatory agencies, few likeness incidents have escalated to that level \u2014 perhaps in part because the legal landscape is still in flux.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">When SAG-AFTRA thanked OpenAI for changing Sora\u2019s guardrails, it used the opportunity to promote the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act, <a href=\"https:\/\/www.theverge.com\/news\/645942\/youtube-is-supporting-the-no-fakes-act-targeting-unauthorized-ai-replicas\" rel=\"nofollow noopener\" target=\"_blank\">a years-old attempt to codify protections<\/a> against \u201cunauthorized digital replicas.\u201d The <a href=\"https:\/\/www.congress.gov\/bill\/119th-congress\/senate-bill\/1367\" rel=\"nofollow noopener\" target=\"_blank\">NO FAKES Act<\/a>, which has also garnered support from YouTube, introduces nationwide rights to control the use of a \u201ccomputer-generated, highly realistic electronic representation\u201d of a living or dead person\u2019s voice or visual likeness. It includes liability for online services that knowingly allow unauthorized digital replicas, too.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">The NO FAKES Act has generated severe criticism from online free speech groups. 
<a href=\"https:\/\/www.eff.org\/deeplinks\/2025\/06\/no-fakes-act-has-changed-and-its-so-much-worse\" rel=\"nofollow noopener\" target=\"_blank\">The EFF dubbed it<\/a> a \u201cnew censorship infrastructure\u201d mandate that forces platforms to filter content so broadly it will almost inevitably lead to unintentional takedowns and a \u201checkler\u2019s veto\u201d online. The bill includes carveouts for parody, satire, and commentary that should be allowed even without authorization, but they\u2019ll be \u201ccold comfort for those who cannot afford to litigate the question,\u201d the organization warned.<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">Opponents of the NO FAKES Act can take solace in how little legislation Congress manages to pass these days \u2014 we\u2019re currently living through the <a href=\"https:\/\/www.cnbc.com\/2025\/10\/22\/government-shutdown-trump-democrats.html\" rel=\"nofollow noopener\" target=\"_blank\">second-longest federal government shutdown in history<\/a>, and there\u2019s even a separate push to <a href=\"https:\/\/www.theverge.com\/ai-artificial-intelligence\/712537\/trump-ai-action-plan-white-house-ai-law-moratorium\" rel=\"nofollow noopener\" target=\"_blank\">block state AI regulation<\/a> that could nullify new likeness laws. But pragmatically, likeness rules are still coming. Earlier this week YouTube announced it will let Partner Program creators <a href=\"https:\/\/www.theverge.com\/news\/803818\/youtube-ai-likeness-detection-deepfake\" rel=\"nofollow noopener\" target=\"_blank\">search for unauthorized uploads<\/a> using their likeness and request their removal. 
The move expands on existing policies that, among other things, let <a href=\"https:\/\/www.theverge.com\/2023\/11\/14\/23959658\/google-youtube-generative-ai-labels-music-copyright\" rel=\"nofollow noopener\" target=\"_blank\">music industry partners take down<\/a> content that \u201cmimics an artist\u2019s unique singing or rapping voice.\u201d<\/p>\n<p class=\"duet--article--dangerously-set-cms-markup duet--article--standard-paragraph _1ymtmqpi _17nnmdy1 _17nnmdy0 _1xwtict1\">And throughout all this, social norms are still evolving. We\u2019re entering a world where you can easily generate a video of almost anyone doing almost anything \u2014 but when should you? In many cases, those expectations remain up for grabs.<\/p>\n<p>Most of this recent conversation is about AI videos of people doing simply weird or silly things, but historically, research indicates the <a href=\"https:\/\/www.securityhero.io\/state-of-deepfakes\/\" rel=\"nofollow noopener\" target=\"_blank\">overwhelming majority of deepfakes<\/a> have been pornographic images of women, often made without consent. 
Beyond Sora there\u2019s a whole different conversation about things like <a href=\"https:\/\/www.404media.co\/deepfake-tools-spread-on-social-media-research\/\" rel=\"nofollow noopener\" target=\"_blank\">the output of AI nudify services<\/a>, and the <a href=\"https:\/\/www.404media.co\/michigan-us-states-with-deepfakes-laws\/\" rel=\"nofollow noopener\" target=\"_blank\">legal issues<\/a> are similar to those concerning other <a href=\"https:\/\/www.theverge.com\/news\/661230\/trump-signs-take-it-down-act-ai-deepfakes\" rel=\"nofollow noopener\" target=\"_blank\">nonconsensual sexual imagery<\/a>.<\/p>\n<p>On top of the basic legal issue of when a likeness is unauthorized, there are also questions like when a video might be defamatory (if it\u2019s sufficiently realistic) or harassing (if it\u2019s part of a larger pattern of stalking and threats), which could make individual situations even more complicated.<\/p>\n<p>Social platforms are used to being almost always shielded from liability through Section 230, which says they can\u2019t be treated as the publisher or speaker of third-party content. As more and more services take the active step of helping users generate content, how far Section 230 will shield the resulting images and video seems like a fascinating question.<\/p>\n<p>Despite long-standing fears that AI will make it truly impossible to distinguish phantasms from reality, it\u2019s still often simple to use context and \u201ctells\u201d (from specific editing tics to <a href=\"https:\/\/www.theguardian.com\/us-news\/2025\/aug\/07\/chris-cuomo-alexandria-ocasio-cortez-deepfake\" rel=\"nofollow noopener\" target=\"_blank\">obvious watermarks<\/a>) to figure out whether a video was AI-generated. 
The problem is that many people don\u2019t look closely enough or simply don\u2019t care whether it\u2019s fake.<\/p>\n<p>Sarah Jeong\u2019s <a href=\"https:\/\/www.theverge.com\/2024\/8\/22\/24225972\/ai-photo-era-what-is-reality-google-pixel-9\" rel=\"nofollow noopener\" target=\"_blank\">warning about seamlessly manipulated photographs<\/a> is even more relevant now than it was when she published it in 2024. The New York Times <a href=\"https:\/\/www.nytimes.com\/interactive\/2025\/10\/21\/business\/media\/trump-ai-truth-social-no-kings.html\" rel=\"nofollow noopener\" target=\"_blank\">has a comprehensive look<\/a> at Trump\u2019s particular affinity for AI-generated content. And <a href=\"https:\/\/maxread.substack.com\/p\/can-openai-build-a-social-network\" rel=\"nofollow noopener\" target=\"_blank\">Max Read\u2019s analysis<\/a> of Sora as a social platform asks whether it will \u201cwork.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"This 
is The Stepback, a weekly newsletter breaking down one essential story from the tech world. For more&hellip;\n","protected":false},"author":2,"featured_media":252700,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[45],"tags":[182,181,507,38965,172,74,123898],"class_list":{"0":"post-252699","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-column","12":"tag-tech","13":"tag-technology","14":"tag-the-stepback"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/252699","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=252699"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/252699\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/252700"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=252699"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=252699"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=252699"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}