{"id":150896,"date":"2025-11-20T22:18:20","date_gmt":"2025-11-20T22:18:20","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/150896\/"},"modified":"2025-11-20T22:18:20","modified_gmt":"2025-11-20T22:18:20","slug":"trumps-anti-woke-ai-policy-puts-patients-lives-at-risk","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/150896\/","title":{"rendered":"Trump\u2019s Anti-Woke AI Policy Puts Patients\u2019 Lives at Risk"},"content":{"rendered":"<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color min-h-[6.375rem] lg:min-h-[4.75rem] dropcap text-left\" data-testid=\"paragraph-content\">On July 23, President Donald Trump signed a sweeping <a href=\"https:\/\/www.whitehouse.gov\/presidential-actions\/2025\/07\/preventing-woke-ai-in-the-federal-government\/\" rel=\"nofollow noopener\" target=\"_blank\">executive order<\/a> titled \u201cPreventing Woke AI in the Federal Government.\u201d It\u2019s yet another volley in the ongoing political culture war, and a deliberate attempt to <a href=\"https:\/\/time.com\/7210039\/what-is-dei-trump-executive-order-companies-diversity-efforts\/\" rel=\"nofollow noopener\" target=\"_blank\">erase<\/a> terms like diversity, equity, and inclusion (DEI) and roll back the work of those addressing systemic racism in federal artificial intelligence systems.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">But for those of us in medicine, especially those advocating for health equity, this isn\u2019t just political posturing. This order threatens lives. 
It jeopardizes years of work to identify and correct structural biases that have long harmed marginalized communities, particularly Black Americans.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">AI is <a href=\"https:\/\/time.com\/7331890\/ai-doctor-emergency-room-care\/\" rel=\"nofollow noopener\" target=\"_blank\">transforming healthcare<\/a>. It\u2019s already being used to triage emergency room patients, prioritize follow-up care, and predict disease risk. But these algorithms don\u2019t arise from neutral ground. They are trained on real-world data. Data that is anything but unbiased.<\/p>\n<p>Protecting medical accuracy<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">One of the most striking examples came in a 2019 <a href=\"https:\/\/www.science.org\/doi\/10.1126\/science.aax2342\" rel=\"nofollow noopener\" target=\"_blank\">study<\/a> published in Science by researchers from UC Berkeley and the University of Chicago. They examined a widely used commercial healthcare algorithm designed to flag patients for high-risk care management. On the surface, it appeared objective and data-driven. But researchers discovered that the algorithm wasn\u2019t assessing clinical need at all. 
Instead, it was quietly using a proxy: the amount of money previously spent on a patient\u2019s care.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Because Black patients typically receive less care, even when presenting with the same symptoms, that spending proxy led the algorithm to drastically underestimate their need. While nearly 46.5% of Black patients should have been flagged for additional care, the algorithm identified only 17.7%. That\u2019s not a statistical footnote. That\u2019s a system that has been taught to look the other way.<br \/>This isn\u2019t an isolated case. Consider two other race-adjusted algorithms still used today:<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Kidney function is calculated using Glomerular Filtration Rate (GFR) equations, which have long included a \u201ccorrection factor\u201d for Black patients, based on unscientific assumptions about muscle mass. 
Researchers have repeatedly found that this adjustment inflated kidney scores, meaning many Black patients were deemed ineligible for transplants or delayed in receiving specialty care.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">And Pulmonary Function Tests (PFTs), used to diagnose asthma and lung diseases, often apply a race-based correction that assumes Black people naturally have lower lung capacity, lowering detection thresholds and contributing to underdiagnosis.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">These aren\u2019t just historical artifacts. They are examples of how racism can become embedded in code. Quietly, pervasively, and lethally.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">In recent years, clinicians and researchers like myself have pushed back. Many hospitals are removing race-based corrections from medical equations. Equity-centered AI tools are being developed to detect and mitigate disparities, not ignore them. 
This work isn\u2019t about being \u201cwoke.\u201d It\u2019s about being accurate, improving outcomes, and saving lives.<\/p>\n<p>The danger of Trump\u2019s anti-woke culture war<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Trump\u2019s executive order threatens to shut down the important work that has been done to make medical algorithms more accurate.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">By banning federal agencies from considering systemic racism or equity in AI development, the order effectively outlaws the very efforts needed to fix these problems. It silences the data scientists trying to build and foster a fairer system. It tells us that naming inequality is worse than perpetuating it.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Supporters of the order claim it promotes \u201cneutrality.\u201d But neutrality, in a system built on inequity, is not justice. It\u2019s reinforcement of the very biases it pretends to ignore.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">The danger isn\u2019t hypothetical. 
Black patients are already less likely to be offered <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/17534011\/\" rel=\"nofollow noopener\" target=\"_blank\">pain medication<\/a>, more likely to be <a href=\"https:\/\/projects.iq.harvard.edu\/files\/isl\/files\/are_the_misdiagnoses_in_the_healthcare_system_linked_to_systemic_racism_1.pdf\" rel=\"nofollow noopener\" target=\"_blank\">misdiagnosed<\/a>, and more likely to <a href=\"https:\/\/www.ncbi.nlm.nih.gov\/books\/NBK425844\/\" rel=\"nofollow noopener\" target=\"_blank\">die<\/a> from preventable conditions. Ethically designed AI could help surface these disparities earlier. But only if we\u2019re allowed to build it that way.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">And bias in AI doesn\u2019t just harm Black communities. <a href=\"https:\/\/proceedings.mlr.press\/v81\/buolamwini18a\/buolamwini18a.pdf\" rel=\"nofollow noopener\" target=\"_blank\">Studies<\/a> have <a href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC5621718\/?utm_source=chatgpt.com\" rel=\"nofollow noopener\" target=\"_blank\">shown<\/a> facial recognition systems misidentify women and people of color at far higher rates than white men. In one case, an algorithm used in hiring systematically downgraded r\u00e9sum\u00e9s from women. In another, a healthcare tool underestimated the risk of heart disease in women because historical data underdiagnosed them in the first place. This is how inequality replicates. 
Biased inputs become automated decisions without scrutiny or context.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Erasing DEI from AI isn\u2019t about neutrality. It\u2019s about selective memory. It\u2019s an attempt to strip away the language we need to diagnose the problem, let alone fix it. If we force AI to ignore history, it will rewrite it. Not just the facts, but the people those facts represent.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Trump\u2019s executive order politicizes and weaponizes AI. And for millions of Americans already unseen by our legal, medical, and technological systems, the cost will be measured in lives.<\/p>\n","protected":false},"excerpt":{"rendered":"On July 23, President Donald Trump signed a sweeping executive order titled \u201cPreventing Woke AI in the 
Federal&hellip;\n","protected":false},"author":2,"featured_media":150897,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[103,397,396,61,60],"class_list":{"0":"post-150896","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-health","9":"tag-health-care","10":"tag-healthcare","11":"tag-ie","12":"tag-ireland"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/150896","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/comments?post=150896"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/150896\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media\/150897"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media?parent=150896"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/categories?post=150896"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/tags?post=150896"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}