{"id":322641,"date":"2025-12-02T11:34:28","date_gmt":"2025-12-02T11:34:28","guid":{"rendered":"https:\/\/www.newsbeep.com\/au\/322641\/"},"modified":"2025-12-02T11:34:28","modified_gmt":"2025-12-02T11:34:28","slug":"when-ai-is-in-the-room-rethinking-the-medical-conversation","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/au\/322641\/","title":{"rendered":"When AI is in the room: rethinking the medical conversation"},"content":{"rendered":"\n<p>The consultation room has a new participant. It arrives when patients pull up ChatGPT-generated symptom analyses. It appears when we use GenAI to check drug interactions or draft letters. And it\u2019s always listening when ambient AI scribes document our conversations. Whether acknowledged or not, GenAI is fundamentally changing what happens in the consultation room.<\/p>\n<p>This article draws from a new <a href=\"https:\/\/www.bmj.com\/collections\/gen-AI\" rel=\"nofollow noopener\" target=\"_blank\">BMJ series<\/a> examining how generative AI is reshaping doctor-patient relationships. The series introduces the concept of \u201ctriadic care\u201d, the reality that today\u2019s medical encounters often involve not just patient and GP, but also AI systems.<\/p>\n<p>AI usage is exploding on both sides<\/p>\n<p>Since ChatGPT\u2019s release in late 2022, patient use of GenAI chatbots for health information has grown exponentially. In Australia, <a href=\"https:\/\/www.mja.com.au\/journal\/2025\/222\/4\/use-chatgpt-obtain-health-information-australia-2024-insights-nationally\" rel=\"nofollow noopener\" target=\"_blank\">about one in ten adults<\/a> now seeks health advice from ChatGPT. Patients use it to research symptoms, clarify medical jargon, create care plans when specialist support is delayed, and seek second opinions.<\/p>\n<p>Some examples are striking. A US mother used ChatGPT to help diagnose her child\u2019s <a href=\"https:\/\/www.ndtv.com\/feature\/chatgpt-helps-woman-diagnose-sons-rare-condition-after-17-doctors-fail-8163351\" rel=\"nofollow noopener\" target=\"_blank\">tethered cord syndrome<\/a> after traditional consultations missed it. But there\u2019s also harm: AI has <a href=\"https:\/\/link.springer.com\/article\/10.1007\/s00508-024-02329-1\" rel=\"nofollow noopener\" target=\"_blank\">misclassified neurological symptoms<\/a>, delaying stroke treatment.<\/p>\n<p>On the clinician side, adoption is equally rapid. <a href=\"https:\/\/www.ama-assn.org\/practice-management\/digital-health\/2-3-physicians-are-using-health-ai-78-2023\" rel=\"nofollow noopener\" target=\"_blank\">Two-thirds of US physicians<\/a> now use AI tools, up from 38% a year earlier. The fastest-growing application is AI-powered medical scribes, ambient systems that automatically generate clinical notes, promising to reduce documentation burden.<\/p>\n<p>In Australia, we\u2019re seeing similar patterns but lack data. My team at Macquarie University is conducting the first national survey of Australian GPs using digital scribes. 
If you're using these tools, please share your experience at https://gpscribesurvey.getcds.net/.

Image: Today's medical encounters often involve not just patient and GP, but also AI systems (goodluz / Shutterstock).

Why this is different from "Dr Google"

Unlike a Google search, which returns links a patient can check, AI chatbots deliver synthesised reasoning without a verifiable trail of sources. When ChatGPT generates a differential diagnosis, neither you nor your patient can trace how it reached that conclusion. Medical knowledge becomes something to interpret rather than verify.

The statistics are concerning: only 19% of users (https://www.jmir.org/2025/1/e68560/) cross-check chatbot outputs, yet people trust AI responses as much as doctors' advice, even when those responses are inaccurate. This trust-without-verification fundamentally alters consultation dynamics. Yet the capabilities AI opens up are undeniable, and practising with AI assistance may soon feel as inevitable as working with the internet: possible without it, but increasingly impractical.

The risks and opportunities

AI can empower patients to prepare better questions and participate more actively in decisions. But three key risks emerge:

Trust erosion: Research shows patient satisfaction drops when AI authorship is revealed after the fact. In one study, patients rated AI-drafted messages as more empathetic than human ones, until they learned AI wrote them. Transparency requires collaborative review, not post-hoc disclosure.

Digital divides: AI poses particular risks for culturally and linguistically diverse (CALD) populations and people whose first language is not English. Most AI systems are trained on English data and perform poorly in other languages, potentially widening health inequities.

Accountability gaps: Triadic care means doctor and patient retain responsibility for decisions through shared understanding. AI provides input, but accountability cannot be delegated to algorithms making autonomous recommendations.

What doctors can do now

The first step is simple: make AI use visible. For patient AI use, try: "Have you used AI or looked anything up about this? Let's review it together." This normalises disclosure and creates opportunities for collaborative interpretation.

For your own AI use, brief transparency builds trust. "I've checked this interaction with an AI tool, and here's what it suggests", paired with joint review, increases patient trust in both you and the decision.

Documentation matters too. Health systems should add simple "AI involvement" fields to electronic records. Recording why you accepted, modified, or rejected an AI suggestion creates patterns for safety learning and audit trails. For digital AI scribes, obtaining patient consent before using them to transcribe consultations is essential.
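As a purely illustrative sketch (the article does not prescribe any schema, and every field name below is hypothetical), an "AI involvement" entry attached to a clinical note could capture the tool used, what it was used for, whether its suggestion was accepted, modified or rejected, the reason why, and whether consent was obtained for ambient transcription:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class AIAction(str, Enum):
    """How the clinician handled the AI suggestion."""
    ACCEPTED = "accepted"
    MODIFIED = "modified"
    REJECTED = "rejected"


@dataclass
class AIInvolvementRecord:
    """Minimal, hypothetical 'AI involvement' entry linked to a consultation note."""
    consultation_id: str            # reference to the clinical note this relates to
    tool_name: str                  # e.g. the scribe or chatbot product used
    purpose: str                    # e.g. "drug interaction check", "note drafting"
    action: AIAction                # accepted / modified / rejected
    rationale: str                  # why the suggestion was accepted, modified or rejected
    patient_consent_obtained: bool  # e.g. consent for ambient transcription
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Example: a GP records that a scribe-drafted note was edited before signing.
entry = AIInvolvementRecord(
    consultation_id="2025-12-02-0830-consult",
    tool_name="ambient-scribe",
    purpose="draft consultation note",
    action=AIAction.MODIFIED,
    rationale="Removed an examination finding the scribe inferred but that was not performed.",
    patient_consent_obtained=True,
)
print(entry)
```

Whether such a record lives as a structured field in practice software or simply as a line in the note matters less than the habit it supports: recording what the AI suggested, what you did with it, and why.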
Generalist doctors are well-suited to navigate this shift. Our core task has always been dialogue and interpretation with patients: working with uncertainty and helping people weigh options across both general and specialised advice. The consultation is becoming less about exchanging facts and more about co-creating understanding, now with a third voice that needs critical interpretation.

Moving forward together

Australian healthcare is making progress on AI governance. The Department of Health and Aged Care (https://consultations.health.gov.au/medicare-benefits-and-digital-health-division/safe-and-responsible-artificial-intelligence-in-he/) has conducted consultations on safe and responsible AI in healthcare, the RACGP (https://www.racgp.org.au/running-a-practice/technology/artificial-intelligence-ai/conversational-artificial-intelligence) has released guidance on conversational AI and scribes, and professional bodies are developing frameworks. But integrating AI into everyday consultations requires practical action at the frontline.

GPs can start now by making AI use visible in their practices: asking patients about their AI use, documenting their own, and insisting on transparent systems rather than black-box automation. These grassroots practices will inform the broader policy work happening at state and federal levels.

The second and third papers in the BMJ series, published in coming weeks, examine patient perspectives and the clinical competencies needed to use AI transparently and effectively. Together, these papers argue that triadic care is inevitable but manageable, if we make it visible and keep human judgment at the centre.

The consultation room has changed. There's now a third voice in the dialogue. Australian GPs can shape how that voice is integrated, not as a replacement for clinical expertise, but as a tool we interpret together, critically and collaboratively, in service of better care.

The first step is transparency. The second is dialogue. That conversation needs to start now.

Dr David Fraile Navarro is a trained GP and a Research Fellow in Generative AI at the Centre for Health Informatics, Australian Institute of Health Innovation, Macquarie University, Sydney. He is lead author of "Generative AI and the changing dynamics of clinical consultations", published in the BMJ this week as part of a three-paper series on AI in clinical encounters.

If you're an Australian GP using AI-powered medical scribes, please contribute to the national survey at https://gpscribesurvey.getcds.net/

The statements or opinions expressed in this article reflect the views of the authors and do not necessarily represent the official policy of the AMA, the MJA or InSight+ unless so stated.

Subscribe to the free InSight+ weekly newsletter at https://insightplus.mja.com.au/subscription/.
It is available to all readers, not just registered medical practitioners.

If you would like to submit an article for consideration, send a Word version to mjainsight-editor@ampco.com.au.