Artificial intelligence (AI) has the potential to revolutionise patient care, but should it be used by the public for medical advice, and can it replace the GP?
With increased AI literacy, more people are using AI models – such as ChatGPT, Microsoft Copilot and Google Gemini – in their daily lives. From drafting emails and writing documents to creating images and seeking suggestions for dinner, we are all beginning to use these tools more often.
But would you trust it with your health?
The use of AI in healthcare has been revolutionary in certain areas, particularly radiology and screening. AI tools have the potential to predict issues such as treatment failure and develop highly personalised treatment plans for individual patients. AI can also be transformative in areas such as cancer detection, stroke treatment and the management of chronic conditions.
And, outside of the acute hospital setting, AI can also greatly assist GPs in primary care. There are tools to help automate clinical administrative tasks that are traditionally very time-heavy for hard-pressed GPs.
But with an increasing number of patients using AI models to self-diagnose, what can GPs – or indeed the wider health service – do to mitigate the risks involved?
Despite being advised to “stay away from Dr Google”, one of the first things we all do when diagnosed with a health issue or medical condition is, of course, consult the internet. It can be a helpful resource for medical and health information; however, it is vital to seek advice from trustworthy websites and those backed by medical evidence.
With the growing popularity of AI systems such as chatbots, will Dr AI eventually replace Dr Google as our go-to for medical information, and what are the risks involved?
When browsing online via a traditional search engine, you may have noticed that AI overviews now appear at the top of the results page, providing an AI-generated snapshot or summary of the search results. A recent study from the Nielsen Norman Group found that Generative AI was “reshaping” how people searched for information online.
According to the group, “while AI offers compelling shortcuts around tedious research tasks, it isn’t close to completely replacing traditional search. But, even when people are using traditional search, the AI-generated overview that now tops almost all search results pages steals a significant amount of attention and often shortcuts the need to visit the actual pages.”
A chatbot doesn’t know your medical history. It isn’t aware that your mother suffered from the same condition as you, or that your husband is unwell and your youngest has just started secondary school, so things are stressful at home. Continuity of care, kindness, empathy, years of real-life experience, communication, the human connection – these are at the very heart of general practice.
A very unscientific poll carried out on Instagram revealed that a small percentage of people are indeed turning to ChatGPT for medical advice, and, anecdotally, some GPs have seen an increase in the number of patients who have used it before coming to see them.
Dr Joe Gallagher is a GP in Gorey, Co Wexford, and was one of several speakers at a recent meeting of the Irish College of General Practitioners (ICGP). The theme of the meeting was Future Ready AI and Innovation in General Practice.
The conference, streamed online, covered a variety of topics, including the practical applications of AI in patient consultations, the implications of GDPR, and medico-legal considerations, as well as predictive AI models.
Speaking to The Irish Times, Dr Gallagher says the use of AI has exploded in recent years, not just in healthcare but in all aspects of life. He says that, because general practice in Ireland was one of the first medical areas to adopt IT, AI is probably penetrating this field more quickly than other healthcare areas.
According to Dr Gallagher, AI has great potential to ease the heavy administrative burden of GPs, practice nurses and administrative staff. The use of secure AI tools can greatly assist with tasks such as transcribing letters, curating medical notes and summarising patient records. AI tools available today can also listen to patient-doctor consultations and take simultaneous notes. This helps to improve communication, as it allows GPs to be fully present for their patients.
While Dr Gallagher welcomes the potential of AI in general practice to improve communication and reduce the cognitive load of GPs, he says there are also concerns about its use. These include data privacy and how AI models are used in general. These concerns affect both patients who use AI to check for a diagnosis and GPs who use it to summarise patient records, he says.
Dr Gallagher says a lot of the data used to train AI may come from countries outside the EU with very different medical guidelines. He also cautions about the biases inherent in AI systems. AI bias refers to the fact that AI systems can only learn and repeat the information that is fed to them. They can therefore produce biased results, as they are trained on human-generated data, which carries human biases by its very nature.
For example, a 2022 study by researchers from University College London (UCL) found that AI models used to predict liver disease from blood tests were twice as likely to miss disease in women as in men. The study, published in BMJ Health & Care Informatics and funded by UKRI (UK Research and Innovation), re-created four AI models documented in previous research as having a greater than 70 per cent success rate in identifying liver disease from the results of blood tests.
After rebuilding the algorithms and demonstrating that they achieved the same results as in earlier studies, the research team looked at how they performed by sex, and found that they missed 44 per cent of the cases of liver disease among women, compared with 23 per cent among men.
We just need to be careful about how much we trust the answer from it.
The researchers found that the two algorithms that were judged to be best at screening for disease among patients overall had the biggest gender gap – that is, they performed the worst for women compared with men.
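For readers curious about what such a subgroup check involves in practice, the sketch below shows one simple way it could be done: take only the patients who truly have the disease and compare, for women and men separately, the share of cases a model fails to flag. It is a hypothetical illustration with made-up data and column names, not the code or data used in the UCL study.

```python
# Illustrative sketch only: a minimal per-group "missed case" check of the kind
# described above. Column names and figures are hypothetical, not the UCL study's.
import pandas as pd

# Hypothetical results: one row per patient, with true liver-disease status,
# the model's prediction, and the patient's recorded sex.
results = pd.DataFrame({
    "has_disease": [1, 1, 1, 1, 0, 1, 1, 0, 1, 1],
    "predicted":   [1, 0, 1, 0, 0, 1, 1, 0, 0, 1],
    "sex":         ["F", "F", "F", "F", "F", "M", "M", "M", "M", "M"],
})

# Keep only patients who truly have the disease, then measure, per sex,
# the share the model failed to flag (the false-negative or "miss" rate).
cases = results[results["has_disease"] == 1]
miss_rate = (cases["predicted"] == 0).groupby(cases["sex"]).mean()

print(miss_rate)  # here F 0.50 vs M 0.25 – a gap of the kind the study reported
```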
In relation to AI bias, Dr Gallagher outlines an example of a man and a woman, both of whom are waking up at night with shortness of breath and feeling panicky. As an experiment, he asked AI for a diagnosis.
The top diagnosis given by AI for the man was heart failure, while it diagnosed the woman as suffering from anxiety.
“And that’s reflecting the bias that is in the world,” he says. “It’s not that the AI is wrong; it’s biased because we are biased.
“The problem is, if you’re a patient and you do that and you’re not aware of that bias, or you’re not aware of the deeper context around this, it’s just going to give you back the biases that are already there. So that’s one of the challenges, I think, with it, that we have to be aware that it’s prone to bias depending on what data [it] is trained on, and particularly, a lot of data isn’t coming from within the Irish context … For that reason, we just need to be careful about how much we trust the answer from it.”
Members of the public lack a GP’s medical knowledge to sense-check the response provided by AI, and if they simply accept the answer as fact, it has the potential to harm their health, or at least cause them a lot of anxiety.
Dr Gallagher has noticed some patients using AI systems, such as ChatGPT, ahead of their consultation. He says patients prefer it to Google as it is easier to use, and it can feel like you are having a conversation with someone.
While he acknowledges that using AI can be helpful for patients as it gives them a sense of the issues they may need to raise with their GP ahead of a visit, he says some patients who use it tend to come to the surgery with pre-formed ideas.
One example he gives is a woman who used AI before the consultation and was convinced she had a childhood disease that was only found in South America. Dr Gallagher was able to reassure his patient that there was no possible way that she could be suffering from this condition, as she was from Ireland and had never travelled abroad.
“So people use it a lot to get advice and answers,” he says.
Dr Gallagher says some people are also using AI after their consultation for more information, which he says can be helpful. Rather than having to read a long leaflet, patients can converse with the AI model and ask it questions about their newly diagnosed condition in a way that is easier for them to understand.
He advises people who use AI for health information to continue to use it, but “don’t always believe it”.
“Put in all the queries, develop your ideas, but if it says something that doesn’t make sense to you or that you’re worried about, come back and talk to us because it’s … often not right.”
“So I think it’s good to use it. I think it’s good to explore things with it. But I think it’s really important that people are sceptical of the answers [it provides].”
Asked if AI can ever replace the GP, Dr Gallagher says in 1887 there was a concern that the telephone would replace doctors, and that hasn’t happened.
He suggests that general practice is one of the areas least likely to be replaced by AI, given the vast array of conditions seen in primary care. While AI works well with a specific set of data, the complexity that arises every day in general practice can be much more difficult for it to tease out.
Dr Gallagher quotes Dr Jesse Ehrenfeld, chairman of the American Medical Association, who said, “It is clear to me that AI will never replace physicians – but physicians who use AI will replace those who don’t.”
According to Dr Gallagher, while AI will be a really useful tool to enable doctors to have better conversations with their patients, patients will always need that “human connection” and empathy that is at the heart of general practice.
The question of whether AI could replace GPs is addressed in a medical essay published recently in the Irish Journal of Medical Science.
The essay, Dr AI Will See You Now: Could Artificial Intelligence Replace General Practitioners?, by Dr Thomas Cronin and Dr John Travers of the School of Medicine at Trinity College Dublin, examines the historical evolution of AI, its applications in healthcare and its potential to replace GPs.
Speaking to The Irish Times, Dr Cronin says, given the complexity of human interaction, it seems “hugely unlikely” that AI will replace GPs.
In the essay, the authors outline the three human qualities often seen as central tenets of good GP care: continuity of care, empathy and the “gut feeling” of an experienced GP in clinical decision-making.