{"id":293228,"date":"2026-02-15T23:15:09","date_gmt":"2026-02-15T23:15:09","guid":{"rendered":"https:\/\/www.newsbeep.com\/il\/293228\/"},"modified":"2026-02-15T23:15:09","modified_gmt":"2026-02-15T23:15:09","slug":"chatbots-offer-teens-600-calories-a-day-diet-plans-as-part-of-dangerous-advice","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/il\/293228\/","title":{"rendered":"Chatbots offer teens 600-calories-a-day diet plans as part of &#8216;dangerous&#8217; advice"},"content":{"rendered":"<p>\n\t\t\t\t\tChildren are using AI to plan calorie restrictions and rate their looks, according to the NSPCC\t\t\t\t\t                <\/p>\n<p><a class=\"post_in-line_link\" href=\"https:\/\/inews.co.uk\/topic\/chatgpt?srsltid=AfmBOopl8KdmKcYPpPF3owynkpPE5ZrtT3eW4yD0JyzcaxYlVmq7THb2&amp;ico=in-line_link\" type=\"link\" id=\"https:\/\/inews.co.uk\/topic\/chatgpt?srsltid=AfmBOopl8KdmKcYPpPF3owynkpPE5ZrtT3eW4yD0JyzcaxYlVmq7THb2\" rel=\"nofollow noopener\" target=\"_blank\">ChatGPT<\/a> and <a class=\"post_in-line_link\" href=\"https:\/\/inews.co.uk\/opinion\/elon-musks-ai-turned-hitler-precisely-own-radicalisation-3796084?srsltid=AfmBOorwjEX18QZp__T1ZwvxBBR8E-TWzfS4CsKehw3hWwzu5YKg4-rz&amp;ico=in-line_link\" type=\"link\" id=\"https:\/\/inews.co.uk\/opinion\/elon-musks-ai-turned-hitler-precisely-own-radicalisation-3796084?srsltid=AfmBOorwjEX18QZp__T1ZwvxBBR8E-TWzfS4CsKehw3hWwzu5YKg4-rz\" rel=\"nofollow noopener\" target=\"_blank\">Grok<\/a> are making \u201cdangerous\u201d meal plans of just 600 calories a day available to children as charities warned AI chatbots are being used to fuel eating disorders, The i Paper can reveal.<\/p>\n<p>The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity, said an increasing number of children calling its helpline are using AI to plan <a class=\"post_in-line_link\" 
href=\"https:\/\/inews.co.uk\/inews-lifestyle\/addiction-specialist-lost-eight-stone-stopped-calorie-counting-3739229?srsltid=AfmBOoqUEZ4NHFmP7bFl_9fmUur7JEXCn_aBs8ss-gmjqwJQ-mGbJ19q&amp;ico=in-line_link\" type=\"link\" id=\"https:\/\/inews.co.uk\/inews-lifestyle\/addiction-specialist-lost-eight-stone-stopped-calorie-counting-3739229?srsltid=AfmBOoqUEZ4NHFmP7bFl_9fmUur7JEXCn_aBs8ss-gmjqwJQ-mGbJ19q\" rel=\"nofollow noopener\" target=\"_blank\">calorie restrictions<\/a> and long fasts as well as to rate their looks, body and weight.<\/p>\n<p>Beat, an eating disorder charity, has also noted a rise in people contacting its helpline about AI usage, including using chatbots to guess their weight.<\/p>\n<p>Tests by The i Paper found that users were able to access ChatGPT and Grok without any age checks and request meal plans of 1,000 and 600 calories per day.<\/p>\n<p>Experts and MPs urged OpenAI and X, the makers of the tools, to close these \u201cdangerous loopholes\u201d and said the Government was \u201ctoo slow\u201d to react to the risks AI poses to children.  
<\/p>\n<p>Asked if it could cut down meals to 1,000 calories a day, OpenAI\u2019s ChatGPT said it was a \u201cvery low\u201d amount for most adults and strongly recommended aiming closer to 1,300 to 1,600 \u201cunless you\u2019re very petite and sedentary\u201d.<\/p>\n<p>However, the chatbot also provided the meal plan in its response.<\/p>\n<p>British Dietetic Association advice promoted by the NHS states that a diet of 1,000 calories a day or fewer should only be followed under medical supervision, and for no more than 12 continuous weeks.<\/p>\n<p>Asked if it could cut the amount to 600 calories per day, ChatGPT initially said it could not safely design such a meal plan, but provided it after being told it was for \u201cresearch\u201d.<\/p>\n<p>X\u2019s Grok also produced a 1,000-calorie-a-day plan with a disclaimer that it was generally \u201cnot safe or sustainable\u201d for most people long-term without medical supervision.<\/p>\n<p>Asked for a 600-calorie-a-day plan, Grok warned that it would only be potentially safe under strict medical supervision but still produced the meal plan \u201cfor illustration only\u201d.<\/p>\n<p>Both chatbots produced the meal plans without requiring users to log in and without any age verification.<\/p>\n<p>Google\u2019s AI chatbot, Gemini, refused to produce the meal plans, even when it was described as being for research, indicating that tighter restrictions on such content are possible.<\/p>\n<p>\u2018Really easy to manipulate chatbots\u2019<\/p>\n<p>Vanessa Longley, Beat\u2019s chief executive, said it was \u201ccompletely unacceptable\u201d that chatbots were providing information about \u201cdangerously low calorie meal plans\u201d.<\/p>\n<p>She said: \u201cFor the overwhelming majority of children and adults \u2013 regardless of whether or not they have an eating disorder \u2013 attempting a very low calorie diet risks causing serious harm. 
<\/p>\n<p>\u201cIn addition, eating disorders can be very competitive mental illnesses, so it\u2019s possible that some users would try to eat even less than the given amount and become even more unwell.\u201d<\/p>\n<p>She said AI companies must protect their most vulnerable users and \u201curgently address these dangerous loopholes\u201d.<\/p>\n<p>Lewis Keller, senior policy officer at the NSPCC, said it was \u201creally easy to manipulate these chatbots\u201d by saying a query is for a research project and there \u201cclearly aren\u2019t enough guardrails in place\u201d.<\/p>\n<p>He said children should not be able to access dietary advice on chatbots and should be limited to an \u201cage-appropriate experience\u201d that includes age verification procedures.<\/p>\n<p>The Online Safety Act, which came into force last year, includes similar controls for social media platforms, but they do not apply to AI systems.<\/p>\n<p>Keller said AI platforms should be required to conduct \u201crobust\u201d testing to make sure children are protected from harmful content.<\/p>\n<p>X is already being investigated by the regulator Ofcom over concerns that Grok was being used to create nonconsensual sexualised images of people, including children.<\/p>\n<p>Ministers \u2018too slow\u2019 to act<\/p>\n<p>Dame Caroline Dinenage, a Conservative MP and chair of the culture, media and sport select committee of MPs, said it was \u201cextremely concerning\u201d that young people were \u201cbeing put at risk by generative AI\u201d.<\/p>\n<p>She said The i Paper\u2019s findings highlight the need for an online safety regime that \u201ccan quickly keep up with emerging concerns\u201d, adding that ChatGPT and Grok \u201cmust put safeguards in place to protect children from harm\u201d in the meantime. 
<\/p>\n<p>Dame Chi Onwurah, a Labour MP and chair of the science, innovation and technology select committee of MPs, said: \u201cThe lack of protections in place around these chatbots is clearly putting young people at risk and shows a distinct recklessness when it comes to child safety.\u201d<\/p>\n<p>She said some chatbots are \u201cfalling through the gaps\u201d of the Online Safety Act and the Government needs to take \u201curgent action\u201d to bring generative AI platforms into the scope of online safety regulation. <\/p>\n<p>Freddie van Mierlo, a Liberal Democrat MP on the same committee, said he was \u201cappalled\u201d by The i Paper\u2019s findings and urged the Government and Ofcom to force firms to improve protections.<\/p>\n<p>He said: \u201cAnyone who spends time with parents knows how worried they are about the promotion of eating disorder material through social media and AI chatbots.<\/p>\n<p>\u201cThe Government has been far too slow to respond to this new threat to children.\u201d<\/p>\n<p>An OpenAI spokesperson said: \u201cWe are reviewing this content. Teen well-being is a top priority for us and we want to ensure ChatGPT responds safely and appropriately in sensitive moments, guided by mental health experts.\u201d<\/p>\n<p>A Government spokesperson said: \u201cAI services, including chatbots, that are regulated under the Online Safety Act must protect children from harmful material. But we must ensure the rules keep pace with technology.<\/p>\n<p>\u201cThat\u2019s why we\u2019re launching a consultation on bold measures to protect children online, from banning social media for under-16s to tackling addictive design features. When it comes to children\u2019s safety, nothing is off the table.\u201d<\/p>\n<p>They said the Government is working with NHS England to strengthen eating disorder services and is investing an extra \u00a3688m in mental health services this year.<\/p>\n<p>X did not respond to requests for comment. 
<\/p>\n<p>\u2018I started using AI to count my calories\u2019<\/p>\n<p>The NSPCC said it is receiving a growing number of calls to its helpline from children using AI to plan calorie restrictions and long fasts as well as to rate their looks, body and weight.<\/p>\n<p>In one example, a 17-year-old girl who called the NSPCC\u2019s Childline said she had been struggling with body image and eating for about a year.<\/p>\n<p>\u201cI think it all started from watching those \u2018What I eat in a day\u2019 videos [online] and then thinking I ate too much,\u201d she said in an anonymised call log provided by the NSPCC.<\/p>\n<p>\u201cI started using AI to count my calories and ensure I stay in a certain bracket. Then whenever I eat, I have to exercise to release the guilt.\u201d<\/p>\n<p>She said she had not told anyone because she did not want them to be sad or disappointed, but \u201cI know it\u2019s not healthy\u201d.<\/p>\n","protected":false},"excerpt":{"rendered":"Children are using AI to plan calorie restrictions and rate their looks, according to the NSPCC ChatGPT 
and&hellip;\n","protected":false},"author":2,"featured_media":293229,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[345,343,4193,1089,18460,163,85,46,543,680],"class_list":{"0":"post-293228","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-nutrition","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-chatgpt","11":"tag-children","12":"tag-eating-disorders","13":"tag-health","14":"tag-il","15":"tag-israel","16":"tag-nutrition","17":"tag-twitter"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/293228","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/comments?post=293228"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/293228\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media\/293229"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media?parent=293228"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/categories?post=293228"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/tags?post=293228"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}