{"id":238698,"date":"2026-01-11T00:54:15","date_gmt":"2026-01-11T00:54:15","guid":{"rendered":"https:\/\/www.newsbeep.com\/ie\/238698\/"},"modified":"2026-01-11T00:54:15","modified_gmt":"2026-01-11T00:54:15","slug":"complex-and-largely-uncharted-the-grey-area-of-ai","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/ie\/238698\/","title":{"rendered":"&#8216;Complex and largely uncharted&#8217;: the grey area of AI"},"content":{"rendered":"<p>\u201cMy AI scribe stopped working first thing this morning,\u201d says Yunzheng Jiao, principal pharmacist in research and clinical trials at Dudley Group NHS Foundation Trust.\u00a0<\/p>\n<p>Jiao uses ambient voice technology (AVT) \u2014 also known as ambient scribes \u2014 which depend on AI to take notes during conversations with patients.\u00a0<\/p>\n<p>\u201cMy first patient was late. After our consultation, I realised the AI did not transcribe our conversation. It just kept looping, round and round,\u201d he recalls. \u201cI tried three or four times to fix it but couldn\u2019t. I thought, \u2018Oh god, what do I do now?\u2019\u201d<\/p>\n<p>In the end, Jiao restarted the system and discovered the file was corrupted and unusable. \u201cI\u2019d relied too much on the transcription,\u201d he says. \u201cLuckily I still remembered what we\u2019d discussed.\u201d\u00a0<\/p>\n<p>By the time he had rewritten his notes, Jiao was 40 minutes behind for his next patient.<\/p>\n<p>Jiao is one of many pharmacy professionals now using ambient scribes to automate clinical note-taking. 
Tools such as <a href=\"https:\/\/www.heidihealth.com\/en-gb\" rel=\"nofollow noopener\" target=\"_blank\">Heidi<\/a> (Heidi Health),\u00a0<a href=\"https:\/\/www.accurx.com\/scribe\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Accurx Scribe<\/a>\u00a0and\u00a0<a href=\"https:\/\/tortus.ai\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Tortus<\/a>\u00a0\u2018listen\u2019 to consultations and use AI to transcribe and summarise conversations into structured notes or letters. These tools promise to ease the administrative burden of documentation and free up clinicians to spend more time directly with patients [1]. In pharmacy, they have been shown to streamline operations, reduce errors and improve patient care [2,3].<\/p>\n<p>The wins are big \u2014 but the tools\u2019 failures and misinterpretations of human nuance are less well documented.<\/p>\n<p>\u201cSometimes the transcription [\u2026] sounds funny, or it doesn\u2019t make sense. It can miss emphasis or tone, especially with patients who have strong accents or talk softly,\u201d says Jiao.<\/p>\n<p>Yasmin Karsan,\u00a0clinical safety officer and medical device consultant, adds that these types of AI tools can occasionally generate inaccurate details. \u201cI was told about a patient who was sitting with a clinician discussing their inhaler technique. The patient said, \u2018I have a Seretide inhaler\u2019, but the AI summarised it as, \u2018I use my Seretide inhaler two times a day.\u2019 That\u2019s factually incorrect.\u201d\u00a0<\/p>\n<p>A 2024 evaluation also found that summaries can \u201cmisgender\u201d patients and \u201cmistake critical details in transcription\u201d [4].<\/p>\n<p>Both Jiao and Karsan believe accountability still sits squarely with them: as clinicians, they are responsible for reviewing and correcting notes. 
In its\u00a0<a href=\"https:\/\/www.england.nhs.uk\/long-read\/guidance-on-the-use-of-ai-enabled-ambient-scribing-products-in-health-and-care-settings\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">guidance on ambient scribe technology<\/a>, published in April 2025, NHS England states that liability \u201cremains complex and largely uncharted, with limited case law to provide clarity\u201d.\u00a0<\/p>\n<p>For now, responsibility for any mistake is likely to fall on the clinician or their employer.\u00a0<\/p>\n<p>Many of these tools are already embedded in NHS and private settings across the UK, often under self-certified Medicines and Healthcare products Regulatory Agency (MHRA) classification. Yet pharmacy-specific regulation and professional standards on their use lag behind; a\u00a0BMJ\u00a0article says: \u201cWe are in danger of being swept up in the promise of ambient scribes, without considering wider and longer-term effects.\u201d [5]<\/p>\n<p>The General Pharmaceutical Council (GPhC) has yet to issue guidance on any AI use and, while the Royal Pharmaceutical Society (RPS) released its own\u00a0<a href=\"https:\/\/www.rpharms.com\/recognition\/all-our-campaigns\/policy-a-z\/ai\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">recommendations<\/a>\u00a0in January 2025, there are still questions around liability and safe implementation \u2014 even as such tools become routinely used.<\/p>\n<h2>Automation in action<\/h2>\n<p>Ambient scribes use a form of generative AI known as \u2018large language models\u2019 (LLMs) \u2014 the same technology behind\u00a0<a href=\"https:\/\/chatgpt.com\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">ChatGPT<\/a>\u00a0or\u00a0<a href=\"https:\/\/gemini.google.com\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Gemini<\/a>\u00a0(Google) \u2014 to convert speech into structured clinical notes. 
Despite occasional technical errors, Jiao calls it a \u201cgame-changer\u201d.\u00a0<\/p>\n<p>Several\u00a0studies across healthcare settings show that staff morale improves and burnout rates fall after AVT is introduced, largely owing to clinicians spending more time with patients and less time on paperwork [2,6\u20138]. Meanwhile, pharmacy workflow systems that use AI \u2014 such as Titan (Invatech Health), a patient management record system incorporating AI clinical checks \u2014 are\u00a0<a href=\"https:\/\/www.titanpmr.com\/blog\/titan-gets-approval-for-dispensing-doctors\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">reducing dispensing errors<\/a>\u00a0and freeing up clinical capacity.\u00a0<\/p>\n<p>Data show that the use of AI systems is supported by the public. In June 2024, a\u00a0<a href=\"https:\/\/www.health.org.uk\/reports-and-analysis\/analysis\/ai-in-health-care-what-do-the-public-and-nhs-staff-think\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">UK survey<\/a>\u00a0of 7,201 people found that just over half (54%) backed the use of AI in healthcare, rising to 61% when used for administrative purposes.\u00a0<\/p>\n<p>Jiao adds that patients have not objected to its use in consultations.\u00a0<\/p>\n<p>Dervis Alkan Gurol, director of Sussex Pharmacies, agrees, adding: \u201cI have a clip-on microphone and explain that AI will take notes. Patients generally understand and appreciate the use of technology.\u201d\u00a0<\/p>\n<p>Karsan says uptake of these tools is growing \u2014 clinicians welcome the workload relief \u2014 but there\u2019s a danger of over-reliance. \u201cSome clinicians trust the tool after five correct consultations,\u201d she says. \u2018Automation bias\u2019 can creep in when clinicians start trusting a system too much because it\u2019s worked well in the past. 
Some studies express concern that this may lead clinicians to accept what the ambient scribe tells them without critiquing it or applying\u00a0<a href=\"https:\/\/pharmaceutical-journal.com\/learning\/professional-judgement-and-decision-making\/all\" rel=\"nofollow noopener\" target=\"_blank\">professional judgement<\/a> [9,10].<\/p>\n<p>But Gurol isn\u2019t worried about over-reliance. \u201cI don\u2019t think we can say that a clinician\u2019s memory or thinking quality will reduce,\u201d he says. \u201cOn the contrary, when I\u2019m taking notes, I\u2019m having to stop occasionally to think, am I writing this right? That affects my consultations with patients.\u201d<\/p>\n<p>Companies developing AI software are taking this into account. Jack Tabner, general manager at Accurx, a software platform that has developed an ambient scribe, says they\u2019re being \u201cthoughtful\u201d about some of the potential unintended consequences.\u00a0<\/p>\n<p>\u201cThe product is designed to encourage users to read through their notes,\u201d he explains. \u201cIf you try to complete a scribe and haven\u2019t spent long enough on the page \u2014 meaning you couldn\u2019t possibly have read it properly \u2014 a warning pops up asking, \u2018Have you read through this? Have you checked it\u2019s accurate?\u2019\u201d<\/p>\n<p>Safeguards such as this may help reduce user error, but they don\u2019t resolve the bigger ethical questions.<\/p>\n<h2>Ethical challenges of AI<\/h2>\n<p>Stephen Goundrey-Smith, a consultant in pharmacy informatics, believes the main ethical challenges with using AI-powered technology in healthcare are not new. \u201cPrivacy is a problem with all digital systems, conventional as well as AI ones,\u201d he says, with questions about data and bias also presenting challenges. 
What is new, however, is the magnification of these issues, as AI becomes less transparent and harder to check, making \u201caccuracy totally and utterly vital\u201d, he says.<\/p>\n<p>\u201cWe don\u2019t have the experience yet to know what the problems are ethically with specific systems,\u201d Goundrey-Smith adds.\u00a0<\/p>\n<p>This sentiment is echoed by Abi Eccles, senior researcher in digital health at the University of Oxford, and colleagues, writing in the\u00a0BMJ: \u201cUntil further evidence is available, clinicians should use the technology with diligence and caution\u201d \u2014 particularly given limited formal guidance and longer-term evidence on their use.<\/p>\n<p>Some companies are taking a proactive approach to building AI systems responsibly. Tariq Muhammad is chief executive of Invatech Health, which developed technology to automate dispensing, known as \u2018Titan\u2019.\u00a0<\/p>\n<p>According to Muhammad, Titan automates around 80% of the dispensing process, leaving pharmacists to focus on the 20% that needs closer attention. \u201cBecause the clinical check is repetitive, Titan AI can learn from millions of transactions across pharmacies,\u201d he explains.\u00a0<\/p>\n<p>\u201cIn many ways, we\u2019re improving patient safety. We\u2019re saving pharmacists from brain rot and instead showing them a handful of things that actually need their focus.\u201d<\/p>\n<p>To mitigate risk, Titan uses layered oversight: random sampling, pharmacist review and independent safety panels. 
\u201cIt\u2019s a living, breathing system, not just something you switch on and forget,\u201d Muhammad says.\u00a0<\/p>\n<p>Like many developers, he describes Titan as \u201ca tool, not a replacement\u201d.\u00a0<\/p>\n<p>This \u2018human-in-the-loop\u2019 approach offers reassurance for clinicians and patients \u2014 but not quite full protection.\u00a0Matthew Boyd, professor of medicines safety at the University of Nottingham and vice chair of the Pharmacy Law and Ethics Association, believes all pharmacists must understand AI functionality and how and where data are stored. \u201cThe underpinning data pool informing decision-making must be really sound,\u201d he says. \u201cGovernance is critical.\u201d<\/p>\n<h2>Regulation arena<\/h2>\n<p>However, governance only works within a clear regulatory framework \u2014 one that is still catching up with the pace of AI development for tools such as AVT.<\/p>\n<p>Great Britain follows the\u00a0<a href=\"https:\/\/www.legislation.gov.uk\/uksi\/2002\/618\/contents\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">UK Medical Device Regulations 2002<\/a> (MDR), which set essential requirements for safety and performance. 
The MHRA, which oversees these regulations, indicated in <a href=\"https:\/\/www.gov.uk\/government\/news\/first-major-overhaul-of-medical-device-regulation-comes-into-force-across-great-britain\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">June 2025<\/a>\u00a0that AI software with a\u00a0<a href=\"https:\/\/assets.publishing.service.gov.uk\/media\/64a7d22d7a4c230013bba33c\/Medical_device_stand-alone_software_including_apps__including_IVDMDs_.pdf\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">medical purpose<\/a>\u00a0\u2014 as defined in regulation \u2014 is likely to qualify as a\u00a0\u2018medical device\u2019.<\/p>\n<p>Hadi Shahidipour, a national clinical lead and senior regulatory specialist, who oversees medical device technical documentation and compliance for NHS England, says it\u2019s a \u201cmyth\u201d that AI\u00a0software used for healthcare purposes isn\u2019t regulated.\u00a0<\/p>\n<p>\u201cIf\u00a0AI software\u00a0has a \u2018medical purpose\u2019 as defined in the UK MDR, it is likely a medical device,\u201d he explains.\u00a0<\/p>\n<p>\u201cMost AI has sufficient complexity to meet this definition,\u201d\u00a0he adds, although some international standards still need updating.<\/p>\n<p>Medical devices are classified from class I to class III according to risk. Most AI or health IT medical devices \u2014 including Heidi, Titan and Accurx Scribe \u2014 are currently class I, which is the lowest risk category and only requires companies to self-certify. 
However,\u00a0<a href=\"https:\/\/www.gov.uk\/government\/publications\/implementation-of-the-future-regulation-of-medical-devices\/implementation-of-the-future-regulations#futurecore-regulations\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">the MHRA plans to update medical device regulation<\/a>\u00a0in 2026, which could lead to the reclassification of some devices, meaning they may require independent audit before they can be registered. In October 2025, it also\u00a0<a href=\"https:\/\/www.gov.uk\/government\/news\/patients-to-benefit-as-uk-and-us-regulators-forge-new-collaboration-on-medical-technologies-and-ai\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">announced a National AI Commission<\/a>\u00a0to shape recommendations for future regulatory oversight.<\/p>\n<p>Shahidipour notes that technical files for class I devices can range from \u201cvery good\u201d to \u201cnon-existent\u201d in quality because, essentially, \u201cyou\u2019re checking your own homework\u201d.\u00a0<\/p>\n<p>Muhammad shares this concern, warning that self-certification could lead to \u201cmany new products\u2026 with poor safety standards\u201d.\u00a0<\/p>\n<p>Yass Omar, head of legal and regulatory affairs at ambient scribe company Heidi Health, admits that moving into higher-risk classifications is \u201ctime-consuming, resource-intensive and costly\u201d.\u00a0<\/p>\n<p>\u201cIt\u2019s a balance,\u201d he says. \u201cWe want to build innovative, impactful technology, but we have to work closely with regulators and government to do it safely.\u201d<\/p>\n<p>In spring 2024, the MHRA launched the\u00a0<a href=\"https:\/\/www.gov.uk\/government\/collections\/ai-airlock-the-regulatory-sandbox-for-aiamd\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">AI Airlock programme<\/a>, allowing health technology companies to test their AI tools in a sandbox environment under regulatory supervision. 
Karsan was a stakeholder on that programme and calls it a \u201cpositive step\u201d to help inform future regulation. The first pilot ended in April 2025, with a report pending and a\u00a0<a href=\"https:\/\/www.gov.uk\/government\/publications\/ai-airlock-phase-2-cohort\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">second phase<\/a>\u00a0planned for 2026.<\/p>\n<p>Goundrey-Smith adds that it will be crucial to \u201cjoin the dots\u201d with regulations and professional standards.\u00a0<\/p>\n<p>\u201cThere\u2019s work to be done to ensure our professional standards retain integrity in an AI world,\u201d he explains. \u201cFor example, what does it mean for me as a pharmacist to keep data confidential if the system I\u2019m using could disclose it to a third party without my control?\u201d<\/p>\n<h2>Who is responsible?<\/h2>\n<p>While the UK is committed to technological innovation, as outlined in the\u00a0<a href=\"https:\/\/www.rpharms.com\/england\/nhs-transformation\/nhs-10-year-plan-and-pharmacy#:~:text=and%20use%20healthcare.-,Artificial%20intelligence%20(AI),-Artificial%20Intelligence%20(AI\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">NHS ten-year health plan<\/a>, the question of liability remains unanswered. 
A 2024 study suggested that clinicians risk becoming a \u201cliability sink\u201d for AI, absorbing responsibility from AI systems much like a \u2018heat sink\u2019 absorbs excess heat [11].<\/p>\n<p><a href=\"https:\/\/www.england.nhs.uk\/long-read\/guidance-on-the-use-of-ai-enabled-ambient-scribing-products-in-health-and-care-settings\/#a2-10-liability:~:text=within%20NHS%20organisations,for%20its%20outputs\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">NHS England\u2019s current guidance<\/a>\u00a0on ambient scribes, published in April 2025, states that if no specific liable party can be established \u2014 or if a supplier \u201clacks sufficient coverage\u201d \u2014 liability may default to the NHS trust or primary care provider, which holds a \u201cnon-delegable duty\u201d to ensure patient safety and quality of care.\u00a0<\/p>\n<p><a href=\"https:\/\/www.gov.wales\/safe-and-responsible-adoption-ambient-voice-technologies-ai-scribes-clinical-settings-whc2025026-0#179202\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">NHS Wales guidance<\/a>, published in August 2025, emphasises safe AVT adoption without mentioning liability, while\u00a0<a href=\"https:\/\/www.digihealthcare.scot\/app\/uploads\/2025\/07\/NHS-Scotland-Interim-position-on-use-of-Ambient-Voice-Technologies-July-2025-003.pdf\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">NHS Scotland<\/a> advises clear contracts defining roles and liability, and consulting medical defence bodies when in doubt.<\/p>\n<p>For now, pharmacists rely largely on professional judgement.\u00a0The <a href=\"https:\/\/www.rpharms.com\/recognition\/all-our-campaigns\/policy-a-z\/ai#:~:text=The%20General%20Pharmaceutical%20Council%20(GPhC)%2C%20regulator%20for%20pharmacy%20professionals%20and%20pharmacy%20premises%2C%20will%20incorporate%20advancements%20in%20artificial%20intelligence%20(AI)%20deployment%20into%20future%20updates%20of%20professional%20standards\" 
target=\"_blank\" rel=\"noreferrer noopener nofollow\">RPS has said<\/a>\u00a0that the GPhC will include AI deployment in future professional standards and, in October 2025,\u00a0<a href=\"https:\/\/www.rpharms.com\/about-us\/news\/details\/we-are-calling-for-digital-and-ai-skills-to-become-a-core-competency-for-healthcare-professionals\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">called for<\/a>\u00a0digital and AI skills to become a core competency for all healthcare professionals.\u00a0<\/p>\n<p>Amareen Kamboh, head of pharmacy workforce for NHS Hampshire and Isle of Wight, believes education is crucial. \u201cIt\u2019s our responsibility\u2026 to equip the workforce with the necessary skills for the future,\u201d she says.\u00a0<\/p>\n<p>\u201cAt the same time, we can\u2019t forget the current workforce, many of whom don\u2019t yet have those skills.\u201d<\/p>\n<p>Patients are central to the conversation surrounding AI in healthcare, yet evidence of direct impact is still limited. A spokesperson from\u00a0<a href=\"https:\/\/patient.info\/\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Patient.info<\/a>, a patient-facing health platform, said it had not yet seen reported cases where AI use in pharmacy consultations has directly caused problems around confidentiality or consent, but warned that \u201cthese are real risks if the technology is adopted without clear safeguards\u201d.<\/p>\n<p>Insurance considerations add another layer of complexity. Keith Bryceland, principal at Segment Risk, said that the growing use of AI \u201craises important questions around professional accountability\u201d and warned that \u201cpharmacists will remain accountable for their clinical decisions, even when AI is involved\u201d.\u00a0<\/p>\n<p>Karsan points out that while AI can make mistakes, so can humans. \u201cAt what point and at what risk appetite are you accepting the fallibility of AI?\u201d she asks.\u00a0<\/p>\n<p>Kamboh concurs. 
\u201cAI can make things more efficient and safer \u2014 but if an error occurs, it can feel scarier because it feels out of our control,\u201d she says.\u00a0<\/p>\n<p>\u201cAs a patient, if something goes wrong and you hear \u2018It\u2019s down to the machine\u2019 \u2014 it doesn\u2019t carry the same understanding as human error.\u201d<\/p>\n<p>For developers, this brings a level of moral responsibility. Muhammad envisions a future where \u201cAI is essential, not optional\u201d.\u00a0<\/p>\n<p>\u201cIt\u2019s clear the primary care system is broken, so pharmacists have an opportunity to plug that gap,\u201d he says. \u201cThey\u2019re capable and qualified now as prescribers. But we must work differently to meet demand, using technology like AI to scale services safely.\u201d<\/p>\n<p>Ultimately, it is down to the individual pharmacist, and their employer, to decide how best to use AI tools while regulations and guidance continue to develop. In the meantime, pharmacists must weigh the risks against the rewards \u2014 as the spokesperson for Patient.info puts it: \u201cPharmacists work in a trusted role at the frontline of\u00a0patient\u00a0care, and it\u2019s essential that any use of AI tools [\u2026] meets the same standards of confidentiality, transparency and professional judgement as traditional practice.\u201d<\/p>\n<p>Box: Further reading from The Pharmaceutical Journal<\/p>\n<p>1. Sasseville M, Yousefi F, Ouellet S, et al. The Impact of AI Scribes on Streamlining Clinical Documentation: A Systematic Review. Healthcare. 2025;13(12):1447. doi:<a href=\"https:\/\/doi.org\/10.3390\/healthcare13121447\" rel=\"nofollow noopener\" target=\"_blank\">10.3390\/healthcare13121447<\/a><\/p>\n<p>2. Duggan MJ, Gervase J, Schoenbaum A, et al. Clinician Experiences With Ambient Scribe Technology to Assist With Documentation Burden and Efficiency. JAMA Netw Open. 2025;8(2):e2460637. 
doi:<a href=\"https:\/\/doi.org\/10.1001\/jamanetworkopen.2024.60637\" rel=\"nofollow noopener\" target=\"_blank\">10.1001\/jamanetworkopen.2024.60637<\/a><\/p>\n<p>3.<\/p>\n<p>Gonz\u00e1lez-P\u00e9rez Y, Montero Delgado A, Martinez Sesmero JM. [Translated article] Introducing artificial intelligence to hospital pharmacy departments. Farmacia Hospitalaria. 2024;48:TS35-TS44. doi:<a href=\"https:\/\/doi.org\/10.1016\/j.farma.2024.04.001\" rel=\"nofollow noopener\" target=\"_blank\">10.1016\/j.farma.2024.04.001<\/a><\/p>\n<p>4.<\/p>\n<p>Bundy H, Gerhart J, Baek S, et al. Can the Administrative Loads of Physicians be Alleviated by AI-Facilitated Clinical Documentation? J GEN INTERN MED. 2024;39(15):2995-3000. doi:<a href=\"https:\/\/doi.org\/10.1007\/s11606-024-08870-z\" rel=\"nofollow noopener\" target=\"_blank\">10.1007\/s11606-024-08870-z<\/a><\/p>\n<p>5.<\/p>\n<p>Eccles A, Pelly T, Pope C, Powell J. Unintended consequences of using ambient scribes in general practice. BMJ. 2025;390:e085754. doi:<a href=\"https:\/\/doi.org\/10.1136\/bmj-2025-085754\" rel=\"nofollow noopener\" target=\"_blank\">10.1136\/bmj-2025-085754<\/a><\/p>\n<p>6.<\/p>\n<p>Nambudiri VE, Watson AJ, Buzney EA, Kupper TS, Rubenstein MH, Yang FSC. Medical Scribes in an Academic Dermatology Practice. JAMA Dermatol. 2018;154(1):101. doi:<a href=\"https:\/\/doi.org\/10.1001\/jamadermatol.2017.3658\" rel=\"nofollow noopener\" target=\"_blank\">10.1001\/jamadermatol.2017.3658<\/a><\/p>\n<p>7.<\/p>\n<p>Tierney AA, Gayre G, Hoberman B, et al. Ambient Artificial Intelligence Scribes to Alleviate the Burden of Clinical Documentation. NEJM Catalyst. 2024;5(3). doi:<a href=\"https:\/\/doi.org\/10.1056\/cat.23.0404\" rel=\"nofollow noopener\" target=\"_blank\">10.1056\/cat.23.0404<\/a><\/p>\n<p>8.<\/p>\n<p>Olson KD, Meeker D, Troup M, et al. Use of Ambient AI Scribes to Reduce Administrative Burden and Professional Burnout. JAMA Netw Open. 2025;8(10):e2534976. 
doi:<a href=\"https:\/\/doi.org\/10.1001\/jamanetworkopen.2025.34976\" rel=\"nofollow noopener\" target=\"_blank\">10.1001\/jamanetworkopen.2025.34976<\/a><\/p>\n<p>10.<\/p>\n<p>Kocaballi AB, Ijaz K, Laranjo L, et al. Envisioning an artificial intelligence documentation assistant for future primary care consultations: A co-design study with general practitioners. Journal of the American Medical Informatics Association. 2020;27(11):1695-1704. doi:<a href=\"https:\/\/doi.org\/10.1093\/jamia\/ocaa131\" rel=\"nofollow noopener\" target=\"_blank\">10.1093\/jamia\/ocaa131<\/a><\/p>\n<p>11.<\/p>\n<p>Lawton T, Morgan P, Porter Z, et al. Clinicians risk becoming \u201cliability sinks\u201d for artificial intelligence. Future Healthcare Journal. 2024;11(1):100007. doi:<a href=\"https:\/\/doi.org\/10.1016\/j.fhj.2024.100007\" rel=\"nofollow noopener\" target=\"_blank\">10.1016\/j.fhj.2024.100007<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"\u201cMy AI scribe stopped working first thing this morning,\u201d says Yunzheng Jiao, principal pharmacist in research and 
clinical&hellip;\n","protected":false},"author":2,"featured_media":238699,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[103,397,396,61,60],"class_list":{"0":"post-238698","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-health","9":"tag-health-care","10":"tag-healthcare","11":"tag-ie","12":"tag-ireland"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/238698","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/comments?post=238698"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/posts\/238698\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media\/238699"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/media?parent=238698"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/categories?post=238698"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/ie\/wp-json\/wp\/v2\/tags?post=238698"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}