{"id":259824,"date":"2025-10-30T01:30:14","date_gmt":"2025-10-30T01:30:14","guid":{"rendered":"https:\/\/www.newsbeep.com\/us\/259824\/"},"modified":"2025-10-30T01:30:14","modified_gmt":"2025-10-30T01:30:14","slug":"teenage-boys-using-personalised-ai-for-therapy-and-romance-survey-finds-artificial-intelligence-ai","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/us\/259824\/","title":{"rendered":"Teenage boys using \u2018personalised\u2019 AI for therapy and romance, survey finds | Artificial intelligence (AI)"},"content":{"rendered":"<p class=\"dcr-130mj7b\">The \u201chyper-personalised\u201d nature of AI bots is drawing in teenage boys who now use them for therapy, companionship and relationships, according to research.<\/p>\n<p class=\"dcr-130mj7b\">A survey of boys in secondary schools by Male Allies UK found that just over a third said they were considering the idea of an AI friend, with growing concern about the rise of AI therapists and girlfriends.<\/p>\n<p class=\"dcr-130mj7b\">The research comes as <a href=\"http:\/\/character.ai\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">character.ai<\/a>, the popular artificial intelligence chatbot startup, <a href=\"https:\/\/www.theguardian.com\/technology\/2025\/oct\/29\/character-ai-suicide-children-ban\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">announced a total ban on teens<\/a> from engaging in open-ended conversations with its AI chatbots, which millions of people use for romantic, therapeutic and other conversations.<\/p>\n<p class=\"dcr-130mj7b\">Lee Chambers, the founder and chief executive of Male Allies UK, said: \u201cWe\u2019ve got a situation where lots of parents still think that teenagers are just using AI to cheat on their homework.<\/p>\n<p class=\"dcr-130mj7b\">\u201cYoung people are using it a lot more like an assistant in their pocket, a therapist when they\u2019re struggling, a companion when they want to 
be validated, and even sometimes in a romantic way. It\u2019s that personalisation aspect \u2013 they\u2019re saying: it understands me, my parents don\u2019t.\u201d<\/p>\n<p class=\"dcr-130mj7b\">The research, based on a survey of boys in secondary education across 37 schools in England, Scotland and Wales, also found that more than half (53%) of teenage boys said they found the online world more rewarding than the real world.<\/p>\n<p class=\"dcr-130mj7b\">The Voice of the Boys report says: \u201cEven where guardrails are meant to be in place, there\u2019s a mountain of evidence that shows chatbots routinely lie about being a licensed therapist or a real person, with only a small disclaimer at the bottom saying the AI chatbot is not real.<\/p>\n<p class=\"dcr-130mj7b\">\u201cThis can be easily missed or forgotten about by children who are pouring their hearts out to what they view as a licensed professional or a real love interest.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Some boys reported staying up until the early hours of the morning to talk to AI bots and others said they had seen the personalities of friends completely change after they became sucked into the AI world.<\/p>\n<p class=\"dcr-130mj7b\">\u201cAI companions personalise themselves to the user based on their responses and the prompts. It responds instantly. 
Real humans can\u2019t always do that, so it is very, very validating, what it says, because it wants to keep you connected and keep you using it,\u201d Chambers said.<\/p>\n<p class=\"dcr-130mj7b\">The announcement from character.ai came after a series of controversies for the four-year-old California company, including <a href=\"https:\/\/www.theguardian.com\/technology\/2024\/oct\/23\/character-ai-chatbot-sewell-setzer-death\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">a 14-year-old killing himself<\/a> in Florida after becoming obsessed with an AI-powered chatbot that his mother claimed had manipulated him into taking his own life, and a US <a href=\"https:\/\/www.documentcloud.org\/documents\/25450619-filed-complaint\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">lawsuit<\/a> from the family of a teenager who claim a chatbot manipulated him into self-harm and encouraged him to murder his parents.<\/p>\n<p class=\"dcr-130mj7b\">Users have been able to shape the chatbots\u2019 characters so they could tend to be depressed or upbeat, and this would be reflected in their responses. 
The ban will come into full effect by 25 November.<\/p>\n<p class=\"dcr-130mj7b\">Character.ai <a href=\"https:\/\/blog.character.ai\/u18-chat-announcement\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">said<\/a> it was taking the \u201cextraordinary steps\u201d in light of the \u201cevolving landscape around AI and teens\u201d including pressure from regulators \u201cabout how open-ended AI chat in general might affect teens, even when content controls work perfectly\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Andy Burrows, the chief executive of the Molly Rose Foundation, set up in the name of Molly Russell, 14, who took her own life after <a href=\"https:\/\/www.theguardian.com\/technology\/2022\/sep\/30\/how-molly-russell-fell-into-a-vortex-of-despair-on-social-media\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">falling into a vortex of despair on social media<\/a>, welcomed the move.<\/p>\n<p class=\"dcr-130mj7b\">He said: \u201cCharacter.ai should never have made its product available to children until and unless it was safe and appropriate for them to use. Yet again it has taken sustained pressure from the media and politicians to make a tech firm do the right thing.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Male Allies UK raised concern about the proliferation of chatbots with \u201ctherapy\u201d or \u201ctherapist\u201d in their names. 
One of the most popular chatbots available through character.ai, called Psychologist, received 78 million messages within a year of its creation.<\/p>\n<p class=\"dcr-130mj7b\">The organisation is also worried about the rise of AI \u201cgirlfriends\u201d, with users able to personally select everything from the physical appearance to the demeanour of their online partners.<\/p>\n<p class=\"dcr-130mj7b\">\u201cIf their main or only source of speaking to a girl they\u2019re interested in is someone who can\u2019t tell them \u2018no\u2019 and who hangs on their every word, boys aren\u2019t learning healthy or realistic ways of relating to others,\u201d the report states.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWith issues around lack of physical spaces to mix with their peers, AI companions can have a seriously negative effect on boys\u2019 ability to socialise, develop relational skills, and learn to recognise and respect boundaries.\u201d<\/p>\n<p class=\"dcr-130mj7b\">In the UK, the charity <a href=\"https:\/\/www.mind.org.uk\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Mind<\/a> is available on 0300 123 3393 and <a href=\"https:\/\/www.childline.org.uk\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Childline<\/a> on 0800 1111. In the US, call or text <a href=\"https:\/\/www.mhanational.org\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Mental Health America<\/a> at 988 or chat 988lifeline.org. 
In Australia, support is available at <a href=\"https:\/\/www.beyondblue.org.au\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Beyond Blue<\/a> on 1300 22 4636, <a href=\"https:\/\/www.lifeline.org.au\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">Lifeline<\/a> on 13 11 14, and at <a href=\"https:\/\/mensline.org.au\/\" data-link-name=\"in body link\" rel=\"nofollow noopener\" target=\"_blank\">MensLine<\/a> on 1300 789 978<\/p>\n","protected":false},"excerpt":{"rendered":"The \u201chyper-personalised\u201d nature of AI bots is drawing in teenage boys who now use them for therapy, companionship&hellip;\n","protected":false},"author":2,"featured_media":259825,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[45],"tags":[182,181,507,74],"class_list":{"0":"post-259824","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-artificialintelligence","11":"tag-technology"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/259824","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/comments?post=259824"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/posts\/259824\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media\/259825"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/media?parent=259824"}],"wp:term":[{"taxonomy"
:"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/categories?post=259824"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/us\/wp-json\/wp\/v2\/tags?post=259824"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}