{"id":113476,"date":"2025-11-03T18:28:14","date_gmt":"2025-11-03T18:28:14","guid":{"rendered":"https:\/\/www.newsbeep.com\/il\/113476\/"},"modified":"2025-11-03T18:28:14","modified_gmt":"2025-11-03T18:28:14","slug":"ai-companions-could-serve-as-prosthetic-relationships-for-the-lonely","status":"publish","type":"post","link":"https:\/\/www.newsbeep.com\/il\/113476\/","title":{"rendered":"AI companions could serve as \u2018prosthetic relationships\u2019 for the lonely"},"content":{"rendered":"<p>At a dinner with friends not long ago, we did what aging friends often do: compared medical notes \u2014 statins, arthritis injections, upcoming scans. Then someone added, almost offhand: \u201cHave you tried the new AI? It\u2019s better than the old one.\u201d<\/p>\n<p>The contrast was brutal: Our bodies are failing \u2014 memory lapses, slower gaits, surgeries piling up. The machine, meanwhile, keeps improving. For us, there is only one direction left to go. For it, each version promises more fluency, more reach, more permanence.<\/p>\n<p>And this isn\u2019t just novelty. People use artificial intelligence to draft letters, ease stress, rehearse hard conversations \u2014 even to keep them company. A Common Sense Media <a href=\"https:\/\/www.commonsensemedia.org\/sites\/default\/files\/research\/report\/talk-trust-and-trade-offs_2025_web.pdf\" target=\"_blank\" rel=\"noopener nofollow\">survey<\/a> found that more than 70% of U.S. teens have tried AI companions, and a third report finding them as satisfying as real friendships. <a href=\"https:\/\/www.theverge.com\/24216748\/replika-ceo-eugenia-kuyda-ai-companion-chatbots-dating-friendship-decoder-podcast-interview\" target=\"_blank\" rel=\"noopener nofollow\">Replika<\/a> alone has tens of millions of users, many describing emotional or romantic ties. In quiet ways, AI is already helping people think, cope, and connect. 
These tools are beginning to function as prosthetic relationships \u2014 not replacing intimacy, but supporting the emotional and cognitive work it requires.<\/p>\n<p>Relationships are among life\u2019s most sacred elements \u2014 the heart of religion, the foundation of parenting, the core of community. But because they matter so deeply, we must also consider those cut off from them not by choice but by biology, trauma, or the limits of treatment. For these people, a prosthetic relationship isn\u2019t a substitute for intimacy, but a way to approximate it with support, dignity, and hope.<\/p>\n<p>That raises a question health care hasn\u2019t yet faced: If people can form meaningful bonds with machines, should those bonds be recognized as legitimate supports \u2014 especially for people unable to sustain relationships despite years of treatment?<\/p>\n<p>\t\t\t<img decoding=\"async\" width=\"768\" height=\"432\" src=\"https:\/\/www.newsbeep.com\/il\/wp-content\/uploads\/2025\/11\/AdobeStock_1694212676-768x432.jpeg\" class=\"attachment-article-main-medium-large size-article-main-medium-large\" alt=\"\" loading=\"lazy\"  \/>\t\t<\/p>\n<p>\t\t\t\t\t\t<a href=\"https:\/\/www.statnews.com\/2025\/10\/29\/chatbots-doctors-guide-medical-appointments-questions\/\" rel=\"nofollow noopener\" target=\"_blank\">Doctors need to ask patients about chatbots<\/a><\/p>\n<p>Loneliness carries health risks as serious as smoking or obesity. In 2023, the <a href=\"https:\/\/www.hhs.gov\/sites\/default\/files\/surgeon-general-social-connection-advisory.pdf\" target=\"_blank\" rel=\"noopener nofollow\">surgeon general<\/a> called it a public health epidemic. For most, the best treatment is simple: human connection. But what about those who, due to chronic mental health conditions or developmental barriers, struggle to form or sustain those ties even after years of care?<\/p>\n<p>Loneliness is often a symptom, not the root. 
For many, the deeper issue is a persistent difficulty interpreting or tolerating social contact \u2014 even when it\u2019s available. The problem isn\u2019t just isolation, but impaired capacity for connection. Addressing that requires more than presence. It requires a support that meets them where they can engage.<\/p>\n<p>I spent a decade as a psychotherapist, and decades more supervising clinicians and building programs \u2014 from outpatient clinics to assertive community treatment, supportive housing, and job coaching. That range showed me both the possibilities of care and the limits we haven\u2019t yet bridged. These essential services are labor-intensive, limited, and rarely available around the clock. Even at their best, they can\u2019t provide the steady, judgment-free presence some people need every day.<\/p>\n<p>Digital tools like <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/24015913\/\" target=\"_blank\" rel=\"noopener nofollow\">FOCUS<\/a> (a schizophrenia self-management app) and <a href=\"https:\/\/www.researchprotocols.org\/2016\/2\/e77\/\" target=\"_blank\" rel=\"noopener nofollow\">PRIME<\/a> (a motivational app for early psychosis) show promise for patient populations, but they lack the depth and dependability many users need.<\/p>\n<p>The prosthetic analogy<\/p>\n<p>We already accept prostheses for body and mind. A prosthetic leg doesn\u2019t restore a limb; it enables walking. A hearing aid doesn\u2019t cure deafness; it supports participation. AI is not yet medical grade: It \u201challucinates,\u201d and using it for prosthetic relationships would require the same safeguards we demand of insulin pumps or pacemakers.<\/p>\n<p>Prosthetic relationships are not for everyone who feels lonely. 
They\u2019re for people with persistent relational impairments despite adequate treatment \u2014 including those with treatment-resistant depression, complex trauma, personality disorders, <a href=\"https:\/\/pubmed.ncbi.nlm.nih.gov\/32851204\/\" target=\"_blank\" rel=\"noopener nofollow\">autistic burnout<\/a>, social anxiety, serious mental illness, or longstanding social challenges for any reason. For them, a well-designed therapeutic AI companion prescribed by a licensed mental health professional could act as an adaptive device: not replacing connection and relationships, but making them tolerable, reinforcing them, or holding space until more is possible. Perhaps someday, just as audiologists fit hearing aids, there will be specialists to fit each patient with the right AI relationship prosthesis for them.<\/p>\n<p>Exact prevalence data aren\u2019t available, but even a narrow subset of this group \u2014 people with serious mental illness \u2014 likely includes <a href=\"https:\/\/www.samhsa.gov\/data\/sites\/default\/files\/reports\/rpt47095\/National%20Report\/National%20Report\/2023-nsduh-annual-national.pdf\" target=\"_blank\" rel=\"noopener nofollow\">millions<\/a> who live with persistent relational challenges that resist standard care. That\u2019s a population large enough to demand attention and deserving of support. AI chatbots aren\u2019t digital Band-Aids, but potential tools for those whose needs exceed what traditional care can provide. Like any prosthetic, these systems would require fitting, supervision, and clinical judgment.<\/p>\n<p>Stability over friction<\/p>\n<p>Human relationships carry conflict and emotional complexity. Most of us learn to manage that messiness. For some, though, it\u2019s overwhelming. 
A prosthetic relationship could offer a reliable anchor \u2014 steady and responsive \u2014 while reinforcing reality testing, self-regulation, and psychoeducation.<\/p>\n<p>Beyond easing loneliness, these systems could provide ongoing coaching \u2014 practicing social skills, modeling constructive communication, and guiding symptom management in real time. For these users, stability may be more therapeutic than authenticity.<\/p>\n<p>Function over form<\/p>\n<p>\t\t\t<img decoding=\"async\" width=\"768\" height=\"432\" src=\"https:\/\/www.newsbeep.com\/il\/wp-content\/uploads\/2025\/11\/AdobeStock_751405025-768x432.jpeg\" class=\"attachment-article-main-medium-large size-article-main-medium-large\" alt=\"\" loading=\"lazy\"  \/>\t\t<\/p>\n<p>\t\t\t\t<img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.statnews.com\/wp-content\/themes\/stat\/images\/home\/statplus.svg\" width=\"19\" height=\"16\" alt=\"\"\/><br \/>\n\t\t\t\t<a href=\"https:\/\/www.statnews.com\/2025\/10\/29\/ai-psychosis-mental-health-chatbots\/\" rel=\"nofollow noopener\" target=\"_blank\">STAT Plus: \u2018AI psychosis\u2019 discussions ignore a bigger problem with chatbots<\/a><\/p>\n<p>The real test is not whether these relationships feel conventional, but whether they help people function. If a prosthetic tie allows someone to work, care for others, or show up in community, then it has done its job.<\/p>\n<p>A future patient might look something like this: a 45-year-old executive, a former Marine whose battlefield discipline helped him excel in business but left scars he\u2019s never fully shaken. Years of therapy brought little relief, and antidepressants impaired his performance. He now relies on a female avatar \u2014 steady, kind, and unfailingly constructive \u2014 who guides him through conflicts at work and home, even coaching him on board reports. She encourages him to sustain real-world ties while offering stability he finds nowhere else. 
Though he keeps this hidden, the relationship anchors him, and together they periodically assess his readiness to risk intimacy again. For him, the AI is not merely a surrogate for love, but a stabilizing support \u2014 one that helps him keep showing up while he heals.<\/p>\n<p>How it could work<\/p>\n<p>If health care treated prosthetic relationships like other devices, three principles would guide their use:<\/p>\n<p>Eligibility: For people with long-standing relational impairments unresponsive to standard treatments, and certified by clinicians as likely to benefit.<\/p>\n<p>Safeguards: Tiered models could range from light daily support to higher-dependency ties, with informed consent and regular review for risks like isolation or overuse. All systems would require certification as medical-grade.<\/p>\n<p>Parity: If insurers cover wheelchairs and hearing aids, why not this? Coverage could be tied to measurable gains in work, caregiving, or social participation.<\/p>\n<p>The first generation could rely on stable, auditable text \u2014 with future versions extending to phones, wearables, or earpieces. Tone matters: Some need warmth, others restraint. Customizable voices or avatars \u2014 professional, friendly, or playful \u2014 could make prosthetic relationships safer and more effective.<\/p>\n<p>Accuracy is just as crucial. A mental health support system must not fabricate or reinforce delusions. Medical-grade AI must acknowledge uncertainty, flag errors, and avoid presenting low-probability guesses as fact \u2014 especially when emotional safety is at stake.<\/p>\n<p>Regulators are starting to respond. 
In September, the <a href=\"https:\/\/www.ftc.gov\/news-events\/news\/press-releases\/2025\/09\/ftc-launches-inquiry-ai-chatbots-acting-companions#:~:text=The%20Federal%20Trade%20Commission%20is,FTC%20Chairman%20Andrew%20N.%20Ferguson.\" target=\"_blank\" rel=\"noopener nofollow\">Federal Trade Commission<\/a> opened an inquiry into whether AI companions expose youth to harm. The American Psychological Association launched a <a href=\"https:\/\/www.apa.org\/news\/press\/releases\/2025\/07\/labs-digital-badge-program\" target=\"_blank\" rel=\"noopener nofollow\">Digital Badge Program<\/a> to certify tools that meet clinical and privacy standards. These are early but essential steps. They signal that prosthetic relationships are not a science-fiction idea. They are arriving now. Health systems and insurers should begin treating prosthetic relationships as a legitimate branch of cognitive support with oversight and measurable outcomes. Not as replacements for intimacy, but as provisional supports \u2014 until or unless better therapies arrive.<\/p>\n<p>The question is no longer whether AI can become a friend. 
It\u2019s whether it can become a dependable support \u2014 strong enough to keep someone connected when nothing else will.<\/p>\n<p>Harvey Lieberman, Ph.D., is a clinical psychologist and consultant who has led major mental health programs and now writes on the intersection of care and technology.<\/p>\n","protected":false},"excerpt":{"rendered":"At a dinner with friends not long ago, we did what aging friends often do: compared medical notes&hellip;\n","protected":false},"author":2,"featured_media":113477,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[34],"tags":[343,163,521,85,46,522],"class_list":{"0":"post-113476","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-artificial-intelligence","9":"tag-health","10":"tag-healthcare","11":"tag-il","12":"tag-israel","13":"tag-mental-health"},"_links":{"self":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/113476","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/comments?post=113476"}],"version-history":[{"count":0,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/posts\/113476\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media\/113477"}],"wp:attachment":[{"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/media?parent=113476"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-json\/wp\/v2\/categories?post=113476"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.newsbeep.com\/il\/wp-jso
n\/wp\/v2\/tags?post=113476"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}