{"id":550867,"date":"2026-04-26T00:29:08","date_gmt":"2026-04-26T00:29:08","guid":{"rendered":"https:\/\/www.newsbeep.com\/uk\/550867\/"},"modified":"2026-04-26T00:29:08","modified_gmt":"2026-04-26T00:29:08","slug":"contributor-ai-could-democratize-medicine-but-better-regulation-comes-first","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/uk\/550867\/","title":{"rendered":"Contributor: AI could democratize medicine, but better regulation comes first"},"content":{"rendered":"\n<p>Last month, a group of researchers were able to manipulate <a class=\"link\" href=\"https:\/\/mindgard.ai\/blog\/doctronic-is-now-accepting-new-patients-and-unsafe-instructions\" target=\"_blank\" rel=\"nofollow noopener\">an AI-powered drug prescription service <\/a>into tripling an opioid dose and into labeling methamphetamine as safe. Days later, New York lawmakers introduced sweeping<a class=\"link\" href=\"https:\/\/www.nysenate.gov\/legislation\/bills\/2025\/S7263\" target=\"_blank\" rel=\"nofollow noopener\"> legislation<\/a> that analogizes clinical AI to a doctor practicing medicine without a license \u2014 rendering it potentially illegal for AI to provide even basic medical guidance. California has staked out a middle ground, <a class=\"link\" href=\"https:\/\/www.smithlaw.com\/newsroom\/publications\/ai-in-healthcare-faces-new-guardrails-under-californias-ab-489\" target=\"_blank\" rel=\"nofollow noopener\">enacting legislation<\/a> early this year that mandates disclosure to patients when AI is involved.<\/p>\n<p>While states continue to send conflicting signals about how best to regulate AI in medicine, millions of Americans aren\u2019t waiting for consensus. 
<a class=\"link\" href=\"https:\/\/www.kff.org\/health-information-trust\/poll-1-in-3-adults-are-turning-to-ai-chatbots-for-health-information-equaling-the-share-who-use-social-media-for-health\/\" target=\"_blank\" rel=\"nofollow noopener\">Data<\/a> shows that one in three Americans are now turning to AI chatbots to diagnose symptoms and direct care, a figure that doubled in just a single year. In short, AI is already practicing medicine.<\/p>\n<p>I\u2019ve worked as an emergency medicine physician in academic medical centers, a safety-net hospital and a community ER. What defines my experience, across every institution, is the staggering weight of unmet medical need: Patients who run out of an essential medication and can\u2019t get refills. A diabetic who hasn\u2019t seen his endocrinologist for months because appointments are scarce. A UTI that progresses to a kidney infection without prompt treatment. Every day, our system transforms manageable conditions into major crises and turns the ER into a substitute for all the care Americans cannot access. The human cost is<a class=\"link\" href=\"https:\/\/www.commonwealthfund.org\/publications\/newsletter-article\/new-study-us-last-among-wealthy-nations-preventable-deaths\" target=\"_blank\" rel=\"nofollow noopener\"> staggering<\/a>.<\/p>\n<p>Artificial intelligence can change this reality, and the possibilities are neither radical nor experimental. Women should be able to refill birth control without scheduling an appointment. Patients with cold sores or yeast infections shouldn\u2019t have to wait days for a callback; in<a class=\"link\" href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC8075133\/\" target=\"_blank\" rel=\"nofollow noopener\"> many parts of the world<\/a>, that<a class=\"link\" href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC9615465\/\" target=\"_blank\" rel=\"nofollow noopener\"> care is accessible<\/a> without a prescription. 
AI can bring equivalent access to American patients, with appropriate safety standards built in.<\/p>\n<p>Indeed, the most ambitious model of this vision is further along than most people realize: The federal government is currently<a class=\"link\" href=\"https:\/\/arpa-h.gov\/explore-funding\/programs\/advocate\" target=\"_blank\" rel=\"nofollow noopener\"> soliciting proposals<\/a> from the private sector to develop AI that will independently manage heart failure events; heart failure is a disease for which only 1% of patients receive the recommended medication regimen and five-year mortality rates now exceed 50%.<\/p>\n<p>AI\u2019s potential to radically expand access to medicine is a good thing, maybe even a revolutionary one. Most Americans are not choosing between AI and their trusted family doctor. Barriers like cost and doctor shortages mean that Americans are choosing between AI and <a class=\"link\" href=\"https:\/\/kffhealthnews.org\/health-industry\/lack-of-primary-care-tipping-point\/\" target=\"_blank\" rel=\"nofollow noopener\">nothing<\/a>. Those patients deserve better, and AI is the first development in decades that promises tangible help at scale.<\/p>\n<p>That is why, alongside my clinical practice and research, I recently joined a company using AI to democratize access to medicine. I did not make that decision lightly. There is legitimate cause to be wary of a technology as powerful as AI reaching vulnerable patients without appropriate safeguards. But the answer is not the approach New York is considering. Neither physicians nor policymakers can afford to sit on the sidelines while patients fill the many gaps in our healthcare system with AI. We need regulation that is serious, enforceable and built for the speed at which this technology is progressing.<\/p>\n<p>The federal government has already begun influencing this rapidly changing field. 
In January, the Food and Drug Administration updated its<a class=\"link\" href=\"https:\/\/www.fda.gov\/media\/109618\/download\" target=\"_blank\" rel=\"nofollow noopener\"> software guidance<\/a> to allow AI tools to operate with less oversight when assisting doctors. Under the new rubric, software that enables a physician to independently review the basis for an AI recommendation falls outside the agency\u2019s regulation of medical devices. A textbook example would be software that can warn a doctor about dangerous drug interactions before she signs a prescription.<\/p>\n<p>But this carve-out covers AI only with a doctor in the loop. There\u2019s no comparable exemption for AI that talks directly to patients without a doctor in the room, or that makes recommendations in time-critical situations. That technology presumably remains subject to full FDA oversight, though the government has not yet weighed in. Building federal guardrails around fast-moving technology is genuinely difficult, and the FDA\u2019s caution is understandable. But the result is counterintuitive: clinical AI operating the most autonomously is, ironically, the least regulated.<\/p>\n<p>Into this vacuum, states have moved quickly and in different directions. Some, including Utah,<a class=\"link\" href=\"https:\/\/www.azag.gov\/sandbox\" target=\"_blank\" rel=\"nofollow noopener\"> Arizona<\/a> and<a class=\"link\" href=\"https:\/\/statutes.capitol.texas.gov\/?tab=1&amp;code=BC&amp;chapter=BC.553&amp;artSec=\" target=\"_blank\" rel=\"nofollow noopener\"> Texas<\/a>, are building frameworks to accelerate deployment. Others, including New York and California, are moving to curtail AI in medicine. In many respects, this is the laboratories of democracy model working as intended, allowing federal policy to find its footing through state-level experimentation and evidence collection. But 50 competing standards cannot be the answer for a technology this consequential. 
Patients deserve basic protections when they use clinical AI no matter where they live, and companies building these tools need to be held to uniform standards that prioritize patient safety.<\/p>\n<p>The framework we need is an extension of what the FDA already knows how to do: require independent, third-party evidence of safety and effectiveness before a clinical AI system deploys; mandate adversarial security testing as part of the approval process; and impose a uniform federal standard, with room for states to go further but not fall below it. Finally, when AI harms a patient, there must be a clear path to accountability. Medical malpractice has governed physician liability for decades. It can be adapted here.<\/p>\n<p>Many assume that regulation slows down transformative technology, but history suggests otherwise. Federal deposit insurance made people trust banks enough to use them. Federal safety standards made commercial aviation the safest form of mass transportation.<\/p>\n<p>Clinical AI needs the same foundation, and there is urgency to act now \u2014 it is already in patients\u2019 hands, moving faster than any technology we have tried to govern. 
The patients with the most to gain are the same ones with the most to lose if we don\u2019t get it right.<\/p>\n<p>Hashem Zikry is an assistant professor at UCLA and medical director for research and policy at Counsel Health.<\/p>\n","protected":false},"excerpt":{"rendered":"Last month, a group of researchers were able to manipulate an AI-powered drug prescription service into tripling an&hellip;\n","protected":false},"author":2,"featured_media":550868,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[43],"tags":[554,11468,189491,6905,16140,189493,102,2960,189496,3100,189494,11361,71383,189495,8332,86,56,54,55,189492],"class_list":{"0":"post-550867","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-ai","9":"tag-americans","10":"tag-clinical-ai","11":"tag-day","12":"tag-doctor","13":"tag-equivalent-access","14":"tag-health","15":"tag-healthcare","16":"tag-many-part","17":"tag-medicine","18":"tag-new-york-lawmaker","19":"tag-patient","20":"tag-physician","21":"tag-software-guidance","22":"tag-state","23":"tag-technology","24":"tag-uk","25":"tag-united-kingdom","26":"tag-unitedkingdom","27":"tag-well-regulation"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/550867","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/comments?post=550867"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/posts\/550867\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media\/550
868"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/media?parent=550867"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/categories?post=550867"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/uk\/wp-json\/wp\/v2\/tags?post=550867"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}