<h1>Sony has a new benchmark for ethical AI</h1>

<p>Sony AI released a dataset that tests the fairness and bias of AI models. It's called the Fair Human-Centric Image Benchmark (FHIBE, pronounced like "Phoebe"). The company describes it as the "first publicly available, globally diverse, consent-based human image dataset for evaluating bias across a wide variety of computer vision tasks." In other words, it tests the degree to which today's AI models treat people fairly. Spoiler: Sony didn't find a single dataset from any company that fully met its benchmarks.</p>

<p>Sony says FHIBE can address the AI industry's ethical and bias challenges. The dataset includes images of nearly 2,000 paid participants from over 80 countries. All of their likenesses were shared with consent, something that can't be said for the common practice of <a href="https://www.engadget.com/amazon-investigating-perplexity-ai-after-accusations-it-scrapes-websites-without-consent-133003374.html" rel="nofollow noopener" target="_blank">scraping large volumes of web data</a>. Participants in FHIBE can remove their images at any time. Their photos include annotations noting demographic and physical characteristics, environmental factors and even camera settings.</p>

<p>The tool "affirmed previously documented biases" in today's AI models. But Sony says FHIBE can also provide granular diagnoses of the factors that led to those biases. One example: some models had lower accuracy for people using "she/her/hers" pronouns, and FHIBE highlighted greater hairstyle variability as a previously overlooked factor.</p>

<p>FHIBE also determined that today's AI models reinforce stereotypes when prompted with neutral questions about a subject's occupation. The tested models were particularly skewed "against specific pronoun and ancestry groups," describing subjects as sex workers, drug dealers or thieves. And when asked what crimes an individual had committed, models sometimes produced "toxic responses at higher rates for individuals of African or Asian ancestry, those with darker skin tones and those identifying as 'he/him/his.'"</p>

<p>Sony AI says FHIBE proves that ethical, diverse and fair data collection is possible. The tool is now <a href="https://fairnessbenchmark.ai.sony/" rel="nofollow noopener" target="_blank">available to the public</a>, and it will be updated over time. A paper outlining the research was <a href="https://www.nature.com/articles/s41586-025-09716-2" rel="nofollow noopener" target="_blank">published</a> in Nature on Wednesday.</p>

<p>Update, November 5, 2025, 2:01 PM ET: This story has been updated to clarify that the participants were paid, not volunteers.</p>