Marathon runners seem unlikely victims of the AI revolution, but the use of chatbots as personal trainers has prompted warnings that users could be putting themselves at a higher risk of injury.
Those paying for apps that utilise ChatGPT or other generative AI services can receive daily tips on preparation, putting them on what appears to be a bespoke path to race day.
Yet Chris Beavers, a real-life personal trainer, said these plans are “based on the average of what [AI] thinks is best”, with no “constant feedback” each week.
“AI might try and make the programme quite fancy, where it asks people to [go all-out] when, actually, one of the best things for them might just be to pull back on training, or not overdo it, so they don’t get injured and can actually run the marathon,” said Beavers, who runs Father Fit, an online personal training provider.
Many of those preparing for long-distance races have flocked to Runna, which provides AI-generated workout briefings and a structured plan tailored towards the user’s stated goal.
Beavers said another downside of relying on AI is that people often struggle with self-discipline during intense exercise sessions. “When you just have AI trying to keep you accountable you can find it easy not to do it, as it’s just AI and it will give you a placid response,” Beavers said. “When you have to talk to an actual human about it and check in with another person, there’s more of an emotional attachment … you don’t want to let the person down.”
On social media platforms such as TikTok, several running influencers have attributed injuries to following running plans generated by ChatGPT or Runna.

Nick Berners-Price, 53, the founder of 4D Fitness, said following AI-generated plans was particularly risky for amateur runners. “The number one thing that gets you to the start line and then to the finish without an injury is your biomechanics,” he said. “Unless you’ve got a very sophisticated system in some sports lab, then you can’t analyse this.”
Berners-Price, who trains between 50 and 100 people a year, said AI cannot judge “the way you move” and warned that the “wrong technique” can “obviously lead to injuries pretty quickly”.
He said an important role for a personal trainer is finding out about a client’s lifestyle, including their work hours and diet, and “unless you’re revisiting the AI, constantly feeding it new parameters, then I don’t really see how it improves on a one-size-fits-all programme”.
Nutritionists also fear pre-race diet routines based on AI-generated advice can be harmful. Ella Rauen-Prestes Butler, 51, the founder of Fitbakes, said runners frequently “carb up” too much. “The problem is that if you eat too much at once then you will have a rollercoaster of blood sugar, and AI will always give you the same formula,” she said. “AI can be very dangerous … people used to refer to Dr Google but now it’s Dr ChatGPT.”
Not all runners are deterred, however; many take comfort in leaving their schedules to apps and AI. Sophia Parvizi Wayne, 29, a co-founder of Kanjo, a paediatric mental health service, said AI platforms were not “dangerous” but merely “incomplete”.
“Human judgment has to sit in the middle, rather than taking this as an opportunity to reject AI,” she said. “If we are hungover or if our body feels really bad then we have to be able to say to ourselves that we probably shouldn’t [run].”
Runna said: “[Our] plans are designed by experienced coaches using proven training principles. An algorithm then tailors and adapts those coach-designed plans to each runner based on progress, feedback and real-world performance.
“Running, especially long-distance running, is a high-impact sport and injury risk can never be eliminated entirely. Injury risk is influenced by numerous factors including sleep, nutrition, stress, prior injuries and training outside a plan.”