{"id":201466,"date":"2025-10-04T11:39:11","date_gmt":"2025-10-04T11:39:11","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/201466\/"},"modified":"2025-10-04T11:39:11","modified_gmt":"2025-10-04T11:39:11","slug":"ai-voices-are-now-indistinguishable-from-real-human-voices","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/201466\/","title":{"rendered":"AI voices are now indistinguishable from real human voices"},"content":{"rendered":"<p id=\"9abdca61-5854-4611-9fe9-4eb1a47dd8f7\">Most of us have likely experienced <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.livescience.com\/technology\/artificial-intelligence\/what-is-artificial-intelligence-ai\" data-before-rewrite-localise=\"https:\/\/www.livescience.com\/technology\/artificial-intelligence\/what-is-artificial-intelligence-ai\" rel=\"nofollow noopener\" target=\"_blank\">artificial intelligence<\/a> (AI) voices through personal assistants like Siri or Alexa, with their flat intonation and mechanical delivery giving us the impression that we could easily distinguish between an AI-generated voice and a real person. But scientists now say the average listener can no longer tell the difference between real people and &#8220;deepfake&#8221; voices.<\/p>\n<p>In a new study published Sept. 
24 in the journal <a data-analytics-id=\"inline-link\" href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC12459763\/#:~:text=We%20find%20that%20voice%20clones,not%20observe%20a%20hyperrealism%20effect.\" target=\"_blank\" data-url=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC12459763\/#:~:text=We%20find%20that%20voice%20clones,not%20observe%20a%20hyperrealism%20effect.\" referrerpolicy=\"no-referrer-when-downgrade\" data-hl-processed=\"none\" rel=\"nofollow noopener\">PLoS One<\/a>, researchers showed that when people listen to human voices \u2014 alongside AI-generated versions of the same voices \u2014 they cannot accurately identify which are real and which are fake.<\/p>\n<p id=\"9abdca61-5854-4611-9fe9-4eb1a47dd8f7-2\">&#8220;AI-generated voices are all around us now. We\u2019ve all spoken to Alexa or Siri, or had our calls taken by automated customer service systems,&#8221; said lead author of the study <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.qmul.ac.uk\/sbbs\/staff\/nadine-lavan.html\" data-url=\"https:\/\/www.qmul.ac.uk\/sbbs\/staff\/nadine-lavan.html\" target=\"_blank\" referrerpolicy=\"no-referrer-when-downgrade\" data-hl-processed=\"none\" rel=\"nofollow noopener\">Nadine Lavan<\/a>, senior lecturer in psychology at Queen Mary University of London, in a statement. 
&#8220;Those things don\u2019t quite sound like real human voices, but it was only a matter of time until AI technology began to produce naturalistic, human-sounding speech.&#8221;<\/p>\n<p id=\"4323fe54-691f-4c73-b8d3-7854732a237f\">The study suggested that, while generic voices created from scratch were not deemed to be realistic, voice clones trained on the voices of real people \u2014 deepfake audio \u2014 were found to be just as believable as their real-life counterparts.<\/p>\n<p>The scientists gave study participants samples of 80 different voices (40 AI-generated voices and 40 real human voices) and asked them to label which they thought were real and which were AI-generated. On average, only 41% of the from-scratch AI voices were misclassified as human, which suggests it is still possible, in most cases, to tell them apart from real people.<\/p>\n<p>However, for AI voices cloned from humans, the majority (58%) were misclassified as human. Only slightly more (62%) of the human voices were correctly classified as human, leading the researchers to conclude that there was no statistical difference in our capacity to tell the voices of real people apart from their deepfake clones.<\/p>\n<p>The results have potentially <a data-analytics-id=\"inline-link\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3630106.3658911\" target=\"_blank\" data-url=\"https:\/\/dl.acm.org\/doi\/10.1145\/3630106.3658911\" referrerpolicy=\"no-referrer-when-downgrade\" data-hl-processed=\"none\" rel=\"nofollow noopener\">profound implications for ethics, copyright and security<\/a>, Lavan said. 
Should criminals use AI to clone your voice, it becomes that much easier to bypass voice authentication protocols at the bank or to trick your loved ones into transferring money.<\/p>\n<p>We&#8217;ve already seen several incidents play out. On July 9, for example, Sharon Brightwell <a data-analytics-id=\"inline-link\" href=\"https:\/\/people.com\/woman-conned-out-of-usd15k-after-ai-cloned-daughters-voice-terrifying-scam-11775622\" target=\"_blank\" data-url=\"https:\/\/people.com\/woman-conned-out-of-usd15k-after-ai-cloned-daughters-voice-terrifying-scam-11775622\" referrerpolicy=\"no-referrer-when-downgrade\" data-hl-processed=\"none\" rel=\"nofollow noopener\">was tricked out of $15,000<\/a>. Brightwell listened to what she thought was her daughter crying down the phone, telling her that she had been in an accident and that she needed money for legal representation to keep her out of jail. &#8220;There is nobody that could convince me that it wasn\u2019t her,&#8221; Brightwell said of the realistic AI fabrication at the time.<\/p>\n<p>Lifelike AI voices can also be used to fabricate statements by, and interviews with, politicians or celebrities. 
Fake audio might be used to discredit individuals or to incite unrest, sowing social division and conflict. <a data-analytics-id=\"inline-link\" href=\"https:\/\/www.news.com.au\/finance\/work\/leaders\/scammer-uses-ai-voice-clone-of-queensland-premier-steven-miles-to-run-a-bitcoin-investment-con\/news-story\/d1fa46b030794a4461eed0ab08215c10\" target=\"_blank\" data-url=\"https:\/\/www.news.com.au\/finance\/work\/leaders\/scammer-uses-ai-voice-clone-of-queensland-premier-steven-miles-to-run-a-bitcoin-investment-con\/news-story\/d1fa46b030794a4461eed0ab08215c10\" referrerpolicy=\"no-referrer-when-downgrade\" data-hl-processed=\"none\" rel=\"nofollow noopener\">Con artists recently built an AI clone of the voice of Queensland Premier Steven Miles<\/a>, using his profile to try to get people to invest in a Bitcoin scam, for instance.<\/p>\n<p>The researchers emphasised that the voice clones they used in the study were not even particularly sophisticated. They made them with commercially available software and trained them with as little as four minutes of human speech recordings.<\/p>\n<p id=\"42e879d5-bf4e-42d9-bfa9-da4f445d6890\">&#8220;The process required minimal expertise, only a few minutes of voice recordings, and almost no money,&#8221; Lavan said in the statement. &#8220;It just shows how accessible and sophisticated AI voice technology has become.&#8221;<\/p>\n<p>While deepfakes present a multitude of opportunities for malign actors, it isn\u2019t all bad news; there may be more positive opportunities that come with the power to generate AI voices at scale. 
&#8220;There might be applications for improved accessibility, education, and communication, where bespoke high-quality synthetic voices can enhance user experience,&#8221; Lavan said.<\/p>\n","protected":false},"excerpt":{"rendered":"Most of us have likely experienced artificial intelligence (AI) voices through personal assistants like Siri or Alexa, with&hellip;\n","protected":false},"author":2,"featured_media":201467,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[45],"tags":[182,181,507,74],"class_list":{"0":"post-201466","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/201466","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=201466"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/201466\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/201467"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=201466"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=201466"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=201466"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}