{"id":276749,"date":"2025-11-11T02:58:18","date_gmt":"2025-11-11T02:58:18","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/276749\/"},"modified":"2025-11-11T02:58:18","modified_gmt":"2025-11-11T02:58:18","slug":"ai-isnt-coming-for-doctors-its-already-in-the-room","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/276749\/","title":{"rendered":"AI Isn\u2019t Coming for Doctors. It\u2019s Already in the Room."},"content":{"rendered":"<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color min-h-[6.375rem] lg:min-h-[4.75rem] dropcap text-left\" data-testid=\"paragraph-content\">As hospitals turn to AI, patients may no longer know who\u2014or what\u2014is making their medical decisions. As ER physicians, we see how AI guidance is changing what it means to walk into an emergency room.\u00a0<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">This isn\u2019t a policy story. It\u2019s a cultural one: about what it means to have faith in your doctor when the \u201cdoctor\u201d might be an algorithm. According to a recent <a href=\"https:\/\/pmc.ncbi.nlm.nih.gov\/articles\/PMC11874537\/?utm\" rel=\"nofollow noopener\" target=\"_blank\">Journal of American College of Emergency Physicians Primer<\/a>, AI applications in emergency departments are already being used for triage, risk-prediction, and staffing models, the plans that help hospitals make sure they have the right number and mix of doctors, nurses, and other staff working at the right times to care for patients. 
Patients may not know if the person treating them is a doctor or an AI-assisted hybrid. That can feel seamless or unsettling, depending on the stakes. We are grappling with a quiet transformation of the ER, where cost pressures, staffing shortages, and AI copilots are rewriting what it means to see a doctor, and what it means to trust one.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">The shift from physicians to AI isn\u2019t just a staffing solution, but rather a seismic change in how medical decisions are made. Each approach comes with trade-offs. AI can process mountains of data in seconds, but it cannot look a patient in the eye and recognize fear, appreciate the quiet moments of human suffering, or pick up on the unspoken clues that come from holding the hand of someone in pain. Part of our 10,000-plus hours of medical training to become ER doctors is developing the gut instinct that something is wrong, even when a patient\u2019s vital signs and lab work look fine. It&#8217;s catching the subtle clues\u2014a hint of confusion, a faint slur in a patient&#8217;s speech, the quiet panic in their eyes\u2014that a patient might not mention and an algorithm cannot perceive. The human element, the essence of trust and compassion, is exactly where AI stumbles.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Tech companies are racing to integrate AI into the clinical space by creating digital triage systems, diagnostic copilots, and decision-support tools designed to augment or even replace physician oversight. 
And hospitals are moving quickly to adopt it, drawn by the promise of lower costs and sharper diagnostic accuracy. In one recent <a href=\"https:\/\/www.nature.com\/articles\/s41746-025-01543-z\" rel=\"nofollow noopener\" target=\"_blank\">Nature study<\/a>, AI performed on par with non-expert physicians, evidence of how quickly algorithms are catching up to human clinicians in the exam room. OpenAI, Google, and Microsoft are all testing AI-based health care applications. Another company, OpenEvidence, is building an AI-powered tool that gives clinicians quick, evidence-based answers to medical questions, and it is already valued at $3.5 billion.\u00a0<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">To be sure, there are places where AI can shine. It can surface patterns invisible to even the most experienced clinician, linking a lab result from months ago with a medication list and a cluster of symptoms to flag a severe infection risk before anyone else sees it. It can pull up obscure drug interactions, support decision-making, and speed up documentation, leaving physicians with more time for patients and less burnout. 
Used correctly, AI is less a replacement for intuition and more a force multiplier for it.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More: <a href=\"https:\/\/time.com\/7310911\/ambient-ai-doctor-burnout-health-care\/\" rel=\"nofollow noopener\" target=\"_blank\">AI Can Fix the Most Soul-Sucking Part of Medicine<\/a><\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Maybe more than anything, what\u2019s new is that both patients and physicians are now using AI, but not in the same way.\u00a0<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">A few nights ago, a young woman came into the ER with chest pain. Her tests were all normal, but she still seemed on edge. When I asked if she was worried about something, she admitted she\u2019d gone down a ChatGPT rabbit hole after noticing a few skipped heartbeats. The chatbot told her she might have <a href=\"https:\/\/www.hopkinsmedicine.org\/health\/conditions-and-diseases\/arrhythmogenic-right-ventricular-dysplasia--cardiomyopathy-arvdc\" rel=\"nofollow noopener\" target=\"_blank\">arrhythmogenic right ventricular dysplasia<\/a>: a rare, deadly heart condition. (She didn\u2019t.) 
The panic that followed likely caused the symptoms that brought her in.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Another patient, a young man, arrived certain he had appendicitis because ChatGPT told him so. This time, he was right. His symptoms were textbook, and the medical student seeing him independently arrived at the same diagnosis. The AI helped the patient identify his condition sooner and seek treatment. Yet he still required the skillful hands of a surgeon to remove his appendix.\u00a0<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More: <a href=\"https:\/\/time.com\/7321821\/chatgpt-ai-how-to-use-for-health-safely\/\" rel=\"nofollow noopener\" target=\"_blank\">9 Doctor-Approved Ways to Use ChatGPT for Health Advice<\/a><\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">That\u2019s the paradox of this moment: the same technology that fuels confusion and fear can also sharpen insight and speed care. It\u2019s not just changing how we diagnose; it\u2019s also changing how patients arrive and who is caring for them. 
Cost, staffing, and technology have blurred the line between human and machine care, ushering in a new kind of medicine: patients treated by clinicians whose most powerful colleague may be an algorithm.<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">The problem isn\u2019t just that AI could get a diagnosis wrong; it\u2019s that long-term AI use could also erode a clinician\u2019s insight. In one <a href=\"https:\/\/www.thelancet.com\/journals\/langas\/article\/PIIS2468-1253%2825%2900133-5\/abstract\" rel=\"nofollow noopener\" target=\"_blank\">Lancet study<\/a>, doctors became less likely to detect possibly cancerous spots on colonoscopy after they had grown accustomed to using an AI tool. The authors hypothesized that the more doctors relied on the algorithm, the less human judgment they exercised.<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Read More: <a href=\"https:\/\/time.com\/7309274\/ai-lancet-study-artificial-intelligence-colonoscopy-cancer-detection-medicine-deskilling\/\" rel=\"nofollow noopener\" target=\"_blank\">Using AI Made Doctors Less Skilled at Spotting Cancer<\/a><\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">That shift to integrating AI or non-physician clinicians into the ER is not inherently bad, but it is often invisible to patients. 
And that\u2019s the problem.\u00a0<\/p>\n<p class=\"rich-text mb-6 self-baseline font-graphik text-body-large text-black-coffee focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Patients deserve to know when their care is guided by AI, who is ultimately responsible for the decisions being made, and what safeguards exist when the \u201cdoctor in the room\u201d might be an algorithm.\u00a0<\/p>\n<p class=\"rich-text self-baseline font-graphik text-body-large text-black-coffee mb-0 focus-visible:outline focus-visible:outline-black-coffee focus-visible:outline-2 focus-visible:outline-offset-2 focus-visible:shadow-focus-color text-left\" data-testid=\"paragraph-content\">Transparency won\u2019t stop the march of technology, but it may help preserve something medicine can\u2019t afford to lose: trust.<\/p>\n","protected":false},"excerpt":{"rendered":"As hospitals turn to AI, patients may no longer know who\u2014or what\u2014is making their medical decisions. 
As ER&hellip;\n","protected":false},"author":2,"featured_media":276750,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[256,64,63,137,500],"class_list":{"0":"post-276749","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-ai","9":"tag-au","10":"tag-australia","11":"tag-health","12":"tag-healthcare"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/276749","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/comments?post=276749"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/posts\/276749\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media\/276750"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/media?parent=276749"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/categories?post=276749"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/au\/wp-json\/wp\/v2\/tags?post=276749"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}