When Katie finally sat down in her GP’s surgery in November she had been in pain for years. Since the birth of her daughter in July 2023, sex had been agony. Yet the mother of three, a teacher, had delayed booking an appointment — she simply didn’t have the time.
“I had to work myself up to it,” said Katie — not her real name. “I didn’t have childcare, so I had to use code words during the appointment so my kids didn’t come away saying anything they shouldn’t.”
After explaining her pain to a stranger, she was met with a shrug. “I was told that this is just what happens after kids. I felt so ignored and so awful. I cried; I felt invisible.”
Feeling failed by a human doctor, she turned to ChatGPT. “I know that AI is programmed to acknowledge me; it said something like, ‘that must be really stressful and tough to deal with right now,’ and then gave me a list of things my pain could be attributed to. It instantly put me at ease,” Katie, 28, said.
She is now in the majority. A study of 1,000 UK women aged 20 to 50 found that 53 per cent would use a free AI tool for medical advice, even while acknowledging that such tools can have an error rate of about 20 per cent.
The report, by Intimina, a Swedish company that makes women’s health products, found that 66 per cent of women admitted they had avoided booking a GP appointment or collecting a prescription because of the associated costs, and 47 per cent said the cost of living had led them to delay buying treatments until symptoms felt “severe”.

Some women struggle to find time for appointments with a doctor, or feel their concerns are dismissed when they do go
A London School of Economics study last year found that AI models systematically downplayed women’s symptoms compared to men’s.
Katie discovered the chatbot’s limits. Asking for help for intense burning and itching, the AI suggested she had thrush so she bought over-the-counter medication. “The cream stung immediately,” she said. “I was in so much pain that I had to get it off my body instantly. I was uncomfortable for three days.”
She did not have thrush but bacterial vaginosis and the incorrect treatment led to further infections. For her persistent pain, the bot suggested generic pelvic floor exercises but a private osteopath later diagnosed pelvic floor tension.
Dr Susanna Unsworth, a women’s health expert with Intimina, said: “AI lacks the clinical nuance essential in intimate health. Self-treating based on a chatbot’s guess can lead to inappropriate treatment and prolonged suffering.”

Despite the errors, Katie still uses the chatbot. “AI helps me collect my thoughts as to what my symptoms are,” she said. “It provides all the information in one neat list that I take to my GP so I can confidently speak about what is wrong and what I need.” She said she would still use it as a “survival tool” given her busy schedule and limited childcare.
Growing numbers of women are also using AI for therapy as a substitute for overstretched mental health professionals.
Unsworth said the healthcare system had to adapt. “There is an opportunity for the NHS to ensure trusted, evidence-based information is available in the digital spaces where people are already looking for answers,” she said.
How to use AI for health safely — Dr Susanna Unsworth’s Dos and Don’ts
• Do use AI to organise your thoughts and draft questions so you make the most of limited consultation time.
• Do use AI to track and understand patterns, like logging menstrual cycles or triggers.
• Don’t share identifiable details, medical records, or photos of symptoms to unregulated platforms.
• Don’t rely on AI as a substitute for professional care. If symptoms persist, worsen, or recur, a physical assessment is essential.