{"id":386215,"date":"2026-01-02T05:41:08","date_gmt":"2026-01-02T05:41:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/386215\/"},"modified":"2026-01-02T05:41:08","modified_gmt":"2026-01-02T05:41:08","slug":"how-ai-is-already-shaping-healthcare-and-what-black-patients-should-know","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/386215\/","title":{"rendered":"How AI Is Already Shaping Healthcare And What Black Patients Should Know"},"content":{"rendered":"<p>\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2026\/01\/GettyImages-1321467199-1200x900.jpg\" alt=\"How AI Is Already Shaping Healthcare And What Black Patients Should Know\" width=\"800\" height=\"600\"\/>\t\t\t\t\t\t\t<\/p>\n<p>\t\t\t\t\t\t\t\tGetty Images\t\t\t\t\t\t\t<\/p>\n<p>\t\t\t\t\t\t\t<img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/www.newsbeep.com\/au\/wp-content\/uploads\/2026\/01\/1767332468_294_.jpeg\" alt=\"\" width=\"78\" height=\"78\"\/><\/p>\n<p>When most of us walk into a doctor\u2019s office, we come prepared with questions. What does this symptom mean? Do I really need this test? What are my treatment options? What we may not be aware of is that there\u2019s often a silent, invisible guest answering those questions right alongside our clinicians: <a href=\"https:\/\/www.essence.com\/news\/essence-fest-2025-the-future-is-now-ai\/\" rel=\"nofollow noopener\" target=\"_blank\">Artificial intelligence<\/a>.<\/p>\n<p>In the past few months alone, the FDA has cleared AI tools that can predict five-year breast cancer risk from routine mammograms, analyze lung sounds during virtual visits, and map the outline of organs on MRIs, through tools like Clairity Breast or Tyto Insights. These, and many others, may already be touching parts of our care without us ever being told they\u2019re there. 
Increasingly, algorithms are offering risk scores, recommending treatment plans, and flagging which patients need attention first.<\/p>\n<p>But the missing piece is that patients often have no idea when this technology is being used on them. Also, there\u2019s no clear way to opt out. AI in healthcare brings real potential, but also <a href=\"https:\/\/www.essence.com\/news\/mit-book-dangers-artificial-intelligence\/\" rel=\"nofollow noopener\" target=\"_blank\">real risks<\/a>, especially for Black communities. What does informed consent look like in an era when algorithms are shaping our care? <\/p>\n<p>\u201cPatients don\u2019t see a pop-up message that says, \u2018Today\u2019s care was brought to you by this algorithm,\u2019\u201d says Tiffani Bright, PhD, assistant professor at Cedars-Sinai and co-director of its Center for AI Research and Education. \u201cThe algorithmic influence is just there in the background.\u201d<\/p>\n<p>That background use can shape far more than we realize. Algorithms help hospitals decide who needs urgent care, which treatments are recommended, and even which appointment slot you get based on your symptoms. When the technology is invisible, Bright explains, patients lose something essential: Agency.<\/p>\n<p>\u201cIf you don\u2019t know it\u2019s there, you don\u2019t have an option,\u201d she says. \u201cYou can\u2019t say, \u2018I don\u2019t know who made this tool. I don\u2019t know what data they used. I don\u2019t know if it even works for patients like me.\u2019\u201d Bright believes that these tools should have to earn the trust of patients, just like doctors do, and that trust must be earned through transparency and equity.\u00a0<\/p>\n<p>For Black patients who already navigate a <a href=\"https:\/\/www.essence.com\/news\/ai-racist-stereotypes\/\" rel=\"nofollow noopener\" target=\"_blank\">healthcare system shaped by discrimination<\/a>, under-treatment, and misdiagnosis, lack of transparency can be downright harmful. 
AI systems learn from patterns in existing data, including medical records, imaging, and lab results. But the historical data may already reflect decades of unequal care.<\/p>\n<p>\u201cWe have to ask who is represented in the data set, and who isn\u2019t,\u201d Bright says. \u201cAnything that uses historical data can amplify existing disparities.\u201d She is intentional about applying equity lenses in her work at the Center for AI Research and Education. \u201cWe test AI for language, gender, insurance, and things like that. We want to make sure that patients, and groups of patients, aren\u2019t underrepresented in our records.\u201d<\/p>\n<p>Black women already face some of the most dangerous gaps in American healthcare, from higher rates of maternal mortality to under-treatment for pain to delayed cancer diagnoses. AI has the potential to help close those gaps, but only if equity is intentionally built into the technology and patients are informed participants in how it affects their care.<\/p>\n<p>Antony Haynes is a privacy law attorney and professor at Albany Law School. He notes that AI tools pick up patterns that reflect economic, racial, and cultural differences, even when race isn\u2019t explicitly included. For example, pulse oximeters (the small clips placed on a finger to test oxygen) are less accurate on darker skin, and temperature scanners often used in clinics can under-detect fevers in Black patients.\u00a0<\/p>\n<p>During COVID, an algorithm used by a major insurer prioritized healthier white patients over sicker Black ones, not because race was an input, but because Black patients historically receive less care, and thus appear \u201clower cost.\u201d Another tool miscategorized asthma patients, who are disproportionately Black, as \u201clow risk\u201d based solely on hospital stay lengths. On a policy level, these disparities remain largely unaddressed. 
\u201cBecause Black patients are not the priority population for industry or regulators, these issues often aren\u2019t corrected,\u201d Haynes says.\u00a0<\/p>\n<p>So, what rights do patients actually have? Legally, this space is murky, but Haynes breaks down a few key points to note: HIPAA, the main federal health privacy law, does not require doctors to disclose when they use AI. Patients generally do not have a federal right to opt out of AI-assisted care, and vendors can often use \u201canonymized\u201d patient data to train AI.<\/p>\n<p>However, some states, like California, give residents the right to opt out of certain automated decisions. The caveat here is that hospitals themselves are usually exempt.<\/p>\n<p>\u201cIn informed consent law, consent is typically required when your data is used for research,\u201d Haynes says. \u201cBut for routine treatment, there\u2019s no requirement at the federal level to inform you about the use of AI.\u201d He believes this needs to change, and urgently.<\/p>\n<p>\u201cYou as a human have the right to a human decision maker,\u201d he says. \u201cYou have the right to know if your doctor relies on software. You have the right to request a human override.\u201d<\/p>\n<p>But even without transparency laws in place, patients still have power. Haynes encourages asking your doctors questions like these:<\/p>\n<p>Are you using AI or software to help diagnose or treat me?<\/p>\n<p>How exactly is it being used?<\/p>\n<p>Are you relying on it, or is it just one tool among others?<\/p>\n<p>If I prefer a human-only decision, can that be done?<\/p>\n<p>\u201cI think you should always ask,\u201d he says. \u201cAt the end of the day, you can seek a second opinion or choose a different provider.\u201d<\/p>\n<p>Bright agrees, adding that Black patients must feel empowered to interrogate the tools, just as you interrogate the system. \u201cDon\u2019t be afraid of the technology, but do be informed. Do ask questions. Do use your voice. 
We want our tools used with our patients, not on them,\u201d she says. \u201cThat\u2019s the difference between ethical AI and everything else. You have the right to understand, and the right to say no.\u201d<\/p>\n<p>People can also call upon lawmakers to draft federal and state legislation requiring doctors to proactively disclose when AI is being used, vendors to disclose exactly how their algorithms were trained, and patients to have access to information that explains what a tool does and doesn\u2019t do. \u201cPatients shouldn\u2019t have to guess,\u201d Haynes says.\u00a0<\/p>\n<p>Ultimately, no matter how advanced the technology becomes, trust in healthcare still starts with one simple principle: Nothing about us, without us.<\/p>\n","protected":false},"excerpt":{"rendered":"Getty Images When most of us walk into a doctor\u2019s office, we come prepared with questions. What does&hellip;\n","protected":false},"author":2,"featured_media":386216,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[64,63,137,500],"class_list":{"0":"post-386215","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-au","9":"tag-australia","10":"tag-health","11":"tag-healthcare"}}