{"id":171700,"date":"2025-09-27T01:53:09","date_gmt":"2025-09-27T01:53:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/171700\/"},"modified":"2025-09-27T01:53:09","modified_gmt":"2025-09-27T01:53:09","slug":"big-tech-ignored-bias-in-ai-justice-ai-gpt-says-it-solved-it","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/171700\/","title":{"rendered":"Big Tech Ignored Bias In AI\u2014Justice AI GPT Says It Solved It"},"content":{"rendered":"<p><img decoding=\"async\" class=\" top-image\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2025\/09\/1758937989_432_960x0.jpg\" alt=\"Black woman standing with numbers and technology near her head\" data-height=\"2160\" data-width=\"3840\" fetchpriority=\"high\" style=\"position:absolute;top:0\"\/><\/p>\n<p>Justice AI GPT may be the world\u2019s first large language model-agnostic AI framework to solve the bias problem.<\/p>\n<p>getty<\/p>\n<p>Many organizations have been relying on artificial intelligence (AI) to assist with workplace decision-making, using AI tools for <a href=\"https:\/\/www.forbes.com\/sites\/keithferrazzi\/2025\/03\/27\/the-ai-recruitment-takeover-redefining-hiring-in-the-digital-age\/\" data-ga-track=\"InternalLink:https:\/\/www.forbes.com\/sites\/keithferrazzi\/2025\/03\/27\/the-ai-recruitment-takeover-redefining-hiring-in-the-digital-age\/\" target=\"_self\" aria-label=\"recruitment\" rel=\"nofollow noopener\">recruitment<\/a>, <a href=\"https:\/\/careerservices.fas.harvard.edu\/ai-interviews-and-offers\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/careerservices.fas.harvard.edu\/ai-interviews-and-offers\/\" aria-label=\"interview evaluations\">interview evaluations<\/a>, <a href=\"https:\/\/www.bsr.org\/en\/emerging-issues\/ai-in-hiring\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/www.bsr.org\/en\/emerging-issues\/ai-in-hiring\" 
aria-label=\"hiring\">hiring<\/a> and even <a href=\"https:\/\/lattice.com\/articles\/using-ai-to-write-performance-reviews-everything-you-need-to-know\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/lattice.com\/articles\/using-ai-to-write-performance-reviews-everything-you-need-to-know\" aria-label=\"performance evaluations\">performance evaluations<\/a>. With the rise of AI, it has become abundantly clear that AI tools are riddled with <a href=\"https:\/\/www.forbes.com\/sites\/janicegassam\/2025\/06\/23\/what-the-workday-lawsuit-reveals-about-ai-bias-and-how-to-prevent-it\/\" data-ga-track=\"InternalLink:https:\/\/www.forbes.com\/sites\/janicegassam\/2025\/06\/23\/what-the-workday-lawsuit-reveals-about-ai-bias-and-how-to-prevent-it\/\" target=\"_self\" aria-label=\"bias\" rel=\"nofollow noopener\">bias<\/a> that impacts their use and effectiveness. Just like humans, AI without human oversight will revert to its biases. \u201cTechnology can crunch numbers and generate data, but it\u2019s humans who must interpret it and make the final call,\u201d explained <a href=\"https:\/\/www.linkedin.com\/in\/joshquintero\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/www.linkedin.com\/in\/joshquintero\/\" aria-label=\"Josh Quintero\">Josh Quintero<\/a>, communications manager for the city of Lynchburg, Virginia. \u201cIn local government, we have a responsibility to do this work fairly and transparently because we owe it to the people we serve.\u201d<\/p>\n<p>Human biases are replicated in the AI systems used for workplace decision-making. 
\u201cWhen AI is used in hiring or performance reviews without proper input or calibration, it doesn\u2019t just carry bias\u2014it rewrites value,\u201d <a href=\"https:\/\/www.linkedin.com\/in\/avatoro\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/www.linkedin.com\/in\/avatoro\/\" aria-label=\"Ava Toro\">Ava Toro<\/a>, a global research consultant for the Consumer Climate Report, wrote in an email. She went on to explain, \u201cA person can contextualize growth at a small business as equal to growth at a Fortune 500, but an uncalibrated system reduces that work to \u2018less than.\u2019 That\u2019s not just bias\u2014it\u2019s a system that structurally misreads talent, disproportionately impacting employees of color and those from certain class backgrounds.\u201d<\/p>\n<p>Rather than discouraging the use of AI, technologist Christian Ortiz developed a revolutionary tool to address AI bias head-on. \u201cIn late 2022, while beta testing ChatGPT 3.5, I saw what others missed,\u201d he explained. \u201cBias was not a glitch in AI; it was the design. As a decolonial social scientist and justice advocate, I asked myself, \u2018Where does this bias come from, and what would it take to dismantle it completely?\u2019 That question led me to build Justice AI GPT. I authored the Decolonial Intelligence Algorithmic Framework\u2122 and DIAL, the Decolonial Intelligence for Access and Liberation, and I created the world\u2019s first decolonial dataset. 
These innovations are my intellectual property, and they became the foundation for my <a href=\"https:\/\/www.justiceaigpt.ca\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\" data-ga-track=\"ExternalLink:https:\/\/www.justiceaigpt.ca\/\" aria-label=\"Justice AI GPT\">Justice AI GPT<\/a>.\u201d<\/p>\n<p>Decolonial social scientist and technologist Christian Ortiz<\/p>\n<p>Christian Ortiz<\/p>\n<p>Ortiz went on to explain how Justice AI GPT is the world\u2019s \u201cfirst large language model-agnostic AI framework to actually solve the bias problem.\u201d So how exactly does Justice AI GPT fix the AI bias problem? \u201cJustice AI works by spotting and dismantling the biased information that comes from Eurocentric, Western colonial datasets,\u201d Ortiz explained. \u201cUnlike other tools that try to patch the harm after it happens, Justice AI prevents bias at the source. It does this by pairing OpenAI\u2019s massive datasets with my decolonial dataset, which was built in collaboration with more than 560 global experts who contributed over three decades of knowledge each from their communities and professions. The result is the first dataset in the world designed not to replicate colonial patterns, but to actively counter them.\u201d<\/p>\n<p class=\"p1\">Many organizations and institutions lean on AI to help them make quick and efficient workplace decisions and to streamline what can be long and convoluted processes, but the convenience of AI is not without its risks. Ortiz explained, \u201cIn policy and workplace culture, bias often hides behind coded language. Job postings or HR documents that emphasize \u2018cultural fit\u2019 or \u2018strong communication skills\u2019 seem neutral but often reinforce whiteness as the standard. Justice AI identifies those hidden codes and rewrites them in ways that affirm global majority expression, multilingualism, and neurodivergent communication styles. 
It ensures that employees are evaluated by their contributions, not penalized by unspoken colonial norms.\u201d<\/p>\n<p class=\"p1\">Ortiz shared that, to date, 112 organizations have implemented Justice AI GPT in the workplace, using it for tasks such as bias audits of policies and procedures and DEI coaching plans. \u201cInstead of guessing where bias might exist, [organizations] can now pinpoint it with precision and redesign systems in real time,\u201d Ortiz said. \u201cThese organizations also use Justice AI to reshape training modules and leadership development, so equity is no longer treated as an afterthought but as the operating principle. I also have two major corporations using Justice AI across their HR departments. In hiring, the tool prevents qualified candidates from being excluded because of ethnic names, non-Western education pathways, or neurodivergent communication styles. In training and development, Justice AI powers cultural impact programs and bias coaching so workplace culture shifts at a systemic level, not just in isolated workshops.\u201d<\/p>\n<p class=\"p1\">Ortiz shared what he envisions for the future of his innovative tool. \u201cIn the next few years, I want to see Justice AI reach a million users. I want it in the hands of organizations across every sector and adopted by governments around the world. The global landscape is shifting, markets are interconnected, migration is reshaping demographics, and technology has collapsed distance. 
How we communicate across cultures has never been more important, and it will decide whether we deepen divisions or build solidarity.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"Justice AI GPT may be the world\u2019s first large language model-agnostic AI framework to solve the bias problem.&hellip;\n","protected":false},"author":2,"featured_media":171701,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[20],"tags":[256,111987,111988,254,255,64,63,10826,5004,111992,111991,5044,111989,111990,105],"class_list":{"0":"post-171700","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-ai-bias","10":"tag-algorithmic-bias","11":"tag-artificial-intelligence","12":"tag-artificialintelligence","13":"tag-au","14":"tag-australia","15":"tag-big-tech","16":"tag-chatgpt","17":"tag-decolonial-datasets","18":"tag-justice-ai-gpt","19":"tag-openai","20":"tag-tech-bias","21":"tag-tech-equity","22":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/171700","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=171700"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/171700\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/171701"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=171700"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbe
ep.com\/au\/wp-json\/wp\/v2\/categories?post=171700"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=171700"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}