{"id":248467,"date":"2026-04-02T13:53:39","date_gmt":"2026-04-02T13:53:39","guid":{"rendered":"https:\/\/www.newsbeep.com\/us-ca\/248467\/"},"modified":"2026-04-02T13:53:39","modified_gmt":"2026-04-02T13:53:39","slug":"a-question-of-human-dignity-biases-in-ai-algorithms-present-unique-harm-to-minorities-campus","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us-ca\/248467\/","title":{"rendered":"\u2018A question of human dignity\u2019: Biases in AI algorithms present unique harm to minorities | Campus"},"content":{"rendered":"<p>Berkeley Law J.S.D. candidate Mahwish Moazzam wasn\u2019t that interested when the first ad for an AI-generated headshot app appeared on her feed. But when ad after ad kept coming, she finally caved in to her curiosity and uploaded a casual selfie. At first, she was pleasantly surprised by how professional the result looked \u2014 the app even put her in a blazer.<\/p>\n<p>But one thing was missing from the headshot: her hijab. Over the next few months, it happened again and again on different apps, until she had a total of 25 headshots without her hijab.<\/p>\n<p>\u201cNobody was talking about it,\u201d Moazzam said. \u201cBut when we look into these (apps), these tools are not merely making mistakes, they are reshaping how people are represented in digital spaces. And when that reshaping repeatedly (erases) visible religious markers of identity, the issue is no longer technical. \u2026 It is discrimination, it is exclusion and it is a question of human dignity.\u201d<\/p>\n<p>Electrical engineering and computer sciences professor Emma Pierson said bias in AI systems that disproportionately harms people of color is nothing new.<\/p>\n<p>In her research on AI discrimination in healthcare, Pierson found that the databases these algorithms are trained on tend to be predominantly European. 
As a result, when diagnosing or predicting risk for particular diseases, these systems tend to produce incorrect results for minorities.<\/p>\n<p>\u201cIf you\u2019ve got an algorithm that\u2019s messed up, that can be messing up tens of millions of people\u2019s lives,\u201d Pierson said.<\/p>\n<p>To produce more equitable outcomes, Moazzam said, both AI training databases and the teams building the AI need to reflect the diversity of the people who use it.<\/p>\n<p>Beyond that, Moazzam acknowledged that it is hard to pinpoint accountability because flaws in these complex systems are not created by any one person.<\/p>\n<p>\u201cFor me, there\u2019s the chain of accountability, starting from training data to getting data in the market,\u201d Moazzam said. \u201cSo (these AI companies) cannot say, \u2018No, (the AI) was free, and it was autonomous. So we are not responsible.\u2019 But no, you are responsible.\u201d<\/p>\n<p>According to Moazzam, several laws already focus on protecting consumers from algorithmic discrimination. A significant example is California\u2019s AB 316, which prevents developers from asserting that \u201cthe artificial intelligence autonomously caused the harm to the plaintiff.\u201d<\/p>\n<p>There is not yet a comprehensive federal law on AI bias and discrimination.<\/p>\n<p>Pierson said she has hope for these algorithms as she works on implementing changes to make AI more impartial.<\/p>\n<p>\u201cI would say the more optimistic flip side of this means that you can correct this on a large scale, right?\u201d Pierson said. \u201cIt\u2019s very difficult to sort of, you know, go in and rewire millions of human doctors. It\u2019s easier to fix AI\u2019s big decisions and skills if you know what\u2019s going wrong.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Berkeley Law J.S.D. 
candidate Mahwish Moazzam wasn\u2019t that interested when the first ad for an AI-generated headshot app&hellip;\n","protected":false},"author":2,"featured_media":248468,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[110875,9419,48514,110874,110873,143,145,144],"class_list":{"0":"post-248467","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-oakland","8":"tag-ab316","9":"tag-berkeley-law","10":"tag-eecs","11":"tag-emma-pierson","12":"tag-mahwish-moazzam","13":"tag-oakland","14":"tag-oakland-headlines","15":"tag-oakland-news"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/posts\/248467","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/comments?post=248467"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/posts\/248467\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/media\/248468"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/media?parent=248467"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/categories?post=248467"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us-ca\/wp-json\/wp\/v2\/tags?post=248467"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}