{"id":258526,"date":"2026-01-30T01:19:10","date_gmt":"2026-01-30T01:19:10","guid":{"rendered":"https:\/\/www.newsbeep.com\/nz\/258526\/"},"modified":"2026-01-30T01:19:10","modified_gmt":"2026-01-30T01:19:10","slug":"amazon-discovered-a-high-volume-of-csam-in-its-ai-training-data-but-isnt-saying-where-it-came-from","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/nz\/258526\/","title":{"rendered":"Amazon discovered a &#8216;high volume&#8217; of CSAM in its AI training data but isn&#8217;t saying where it came from"},"content":{"rendered":"<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 charcoal-color\">The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The &#8220;vast majority&#8221; of that content was reported by Amazon, which found the material in its training data, according to an investigation by <a class=\"link \" href=\"https:\/\/www.bloomberg.com\/news\/features\/2026-01-29\/amazon-found-child-sex-abuse-in-ai-training-data\" data-i13n=\"cpos:1;pos:1\" rel=\"nofollow noopener\" target=\"_blank\" data-ylk=\"slk:Bloomberg;cpos:1;pos:1;elm:context_link;itc:0;sec:content-canvas\" data-yga=\"{&quot;yLinkText&quot;:&quot;Bloomberg&quot;,&quot;yLinkPosition&quot;:&quot;1&quot;,&quot;yPosition&quot;:&quot;1&quot;,&quot;yLinkElement&quot;:&quot;context_link&quot;,&quot;yModuleName&quot;:&quot;content-canvas&quot;,&quot;yHasCommerce&quot;:false}\">Bloomberg<\/a>. In addition, Amazon said only that it obtained the inappropriate content from external sources used to train its AI services and claimed it could not provide any further details about where the CSAM came from.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 charcoal-color\">&#8220;This is really an outlier,&#8221; Fallon McNulty, executive director of NCMEC\u2019s CyberTipline, told Bloomberg. 
The CyberTipline is where many types of US-based companies are legally required to report suspected CSAM. \u201cHaving such a high volume come in throughout the year begs a lot of questions about where the data is coming from, and what safeguards have been put in place.\u201d She added that, aside from Amazon, the AI-related reports the organization received from other companies last year included actionable data that it could pass along to law enforcement for next steps. Since Amazon isn\u2019t disclosing sources, McNulty said its reports have proved \u201cinactionable.\u201d<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 charcoal-color\">&#8220;We take a deliberately cautious approach to scanning foundation model training data, including data from the public web, to identify and remove known [child sexual abuse material] and protect our customers,&#8221; an Amazon representative said in a statement to Bloomberg. The spokesperson also said that Amazon aimed to over-report its figures to NCMEC in order to avoid missing any cases. The company said that it removed the suspected CSAM content before feeding the training data into its AI models.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 charcoal-color\">The safety of minors has emerged as a critical concern for the artificial intelligence industry in recent months. Reports of AI-related CSAM have skyrocketed in NCMEC&#8217;s records: the organization received more than 1 million such reports last year, up from 67,000 in 2024 and just 4,700 in 2023.<\/p>\n<p class=\"col-body mb-4 leading-7 text-[18px] md:leading-8 break-words min-w-0 charcoal-color\">In addition to issues such as abusive content being used to train models, AI chatbots have also been implicated in several dangerous or tragic cases involving young users. 
<a class=\"link \" href=\"https:\/\/www.engadget.com\/ai\/the-first-known-ai-wrongful-death-lawsuit-accuses-openai-of-enabling-a-teens-suicide-212058548.html\" data-i13n=\"cpos:2;pos:1\" data-ylk=\"slk:OpenAI;cpos:2;pos:1;elm:context_link;itc:0;sec:content-canvas\" data-yga=\"{&quot;yLinkText&quot;:&quot;OpenAI&quot;,&quot;yLinkPosition&quot;:&quot;2&quot;,&quot;yPosition&quot;:&quot;1&quot;,&quot;yLinkElement&quot;:&quot;context_link&quot;,&quot;yModuleName&quot;:&quot;content-canvas&quot;,&quot;yHasCommerce&quot;:false}\" rel=\"nofollow noopener\" target=\"_blank\">OpenAI<\/a> and <a class=\"link \" href=\"https:\/\/www.engadget.com\/ai\/another-lawsuit-blames-an-ai-company-of-complicity-in-a-teenagers-suicide-184529475.html\" data-i13n=\"cpos:3;pos:1\" data-ylk=\"slk:Character.AI;cpos:3;pos:1;elm:context_link;itc:0;sec:content-canvas\" data-yga=\"{&quot;yLinkText&quot;:&quot;Character.AI&quot;,&quot;yLinkPosition&quot;:&quot;3&quot;,&quot;yPosition&quot;:&quot;1&quot;,&quot;yLinkElement&quot;:&quot;context_link&quot;,&quot;yModuleName&quot;:&quot;content-canvas&quot;,&quot;yHasCommerce&quot;:false}\" rel=\"nofollow noopener\" target=\"_blank\">Character.AI<\/a> have both been sued after teenagers planned their suicides with those companies&#8217; platforms. 
<a class=\"link \" href=\"https:\/\/www.engadget.com\/social-media\/meta-is-temporarily-pulling-teens-access-from-its-ai-chatbot-characters-180626052.html\" data-i13n=\"cpos:4;pos:1\" data-ylk=\"slk:Meta;cpos:4;pos:1;elm:context_link;itc:0;sec:content-canvas\" data-yga=\"{&quot;yLinkText&quot;:&quot;Meta&quot;,&quot;yLinkPosition&quot;:&quot;4&quot;,&quot;yPosition&quot;:&quot;1&quot;,&quot;yLinkElement&quot;:&quot;context_link&quot;,&quot;yModuleName&quot;:&quot;content-canvas&quot;,&quot;yHasCommerce&quot;:false}\" rel=\"nofollow noopener\" target=\"_blank\">Meta<\/a> is also being sued for alleged failures to protect teen users from sexually explicit conversations with chatbots.<\/p>\n","protected":false},"excerpt":{"rendered":"The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related&hellip;\n","protected":false},"author":2,"featured_media":258527,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[148762,365,414,363,364,902,39938,148761,148765,148763,111,139,69,145,148764,20168],"class_list":{"0":"post-258526","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-actionable-data","9":"tag-ai","10":"tag-amazon","11":"tag-artificial-intelligence","12":"tag-artificialintelligence","13":"tag-bloomberg","14":"tag-child-sexual-abuse","15":"tag-csam","16":"tag-cybertipline","17":"tag-ncmec","18":"tag-new-zealand","19":"tag-newzealand","20":"tag-nz","21":"tag-technology","22":"tag-the-national-center-for-missing-and-exploited-children","23":"tag-training-data"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/258526","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/types\/post"}],
"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/comments?post=258526"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/posts\/258526\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media\/258527"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/media?parent=258526"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/categories?post=258526"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/nz\/wp-json\/wp\/v2\/tags?post=258526"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}