At a dinner with friends not long ago, we did what aging friends often do: compared medical notes — statins, arthritis injections, upcoming scans. Then someone added, almost offhand: “Have you tried the new AI? It’s better than the old one.”
The contrast was brutal: Our bodies are failing — memory lapses, slower gaits, surgeries piling up. The machine, meanwhile, keeps improving. For us, there is only one direction left to go. For it, each version promises more fluency, more reach, more permanence.
And this isn’t just novelty. People use artificial intelligence to draft letters, ease stress, rehearse hard conversations — even to keep them company. A Common Sense Media survey found that more than 70% of U.S. teens have tried AI companions, and a third report finding them as satisfying as real friendships. Replika alone has tens of millions of users, many describing emotional or romantic ties. In quiet ways, AI is already helping people think, cope, and connect. These tools are beginning to function as prosthetic relationships — not replacing intimacy, but supporting the emotional and cognitive work it requires.
Relationships are among life’s most sacred elements — the heart of religion, the foundation of parenting, the core of community. But because they matter so deeply, we must also consider those cut off from them not by choice but by biology, trauma, or the limits of treatment. For these people, a prosthetic relationship isn’t a substitute for intimacy, but a way to approximate it with support, dignity, and hope.
That raises a question health care hasn’t yet faced: If people can form meaningful bonds with machines, should those bonds be recognized as legitimate supports — especially for people unable to sustain relationships despite years of treatment?
Loneliness carries health risks as serious as smoking or obesity. In 2023, the surgeon general called it a public health epidemic. For most, the best treatment is simple: human connection. But what about those who, due to chronic mental health conditions or developmental barriers, struggle to form or sustain those ties even after years of care?
Loneliness is often a symptom, not the root. For many, the deeper issue is a persistent difficulty interpreting or tolerating social contact — even when it’s available. The problem isn’t just isolation, but impaired capacity for connection. Addressing that requires more than presence. It requires a support that meets them where they can engage.
I spent a decade as a psychotherapist, and decades more supervising clinicians and building programs — from outpatient clinics to assertive community treatment, supportive housing, and job coaching. That range showed me both the possibilities of care and the limits we haven’t yet bridged. These essential services are labor-intensive, limited, and rarely available around the clock. Even at their best, they can’t provide the steady, judgment-free presence some people need every day.
Digital tools like FOCUS (a schizophrenia self-management app) and PRIME (a motivational app for early psychosis) show promise for these patient populations, but they lack the depth and dependability many users need.
The prosthetic analogy
We already accept prostheses for body and mind. A prosthetic leg doesn’t restore a limb; it enables walking. A hearing aid doesn’t cure deafness; it supports participation. AI is not yet medical grade: It “hallucinates,” and using it for prosthetic relationships would require the same safeguards we demand of insulin pumps or pacemakers.
Prosthetic relationships are not for everyone who feels lonely. They're for people with persistent relational impairments despite adequate treatment — including those with treatment-resistant depression, complex trauma, personality disorders, autistic burnout, social anxiety, serious mental illness, or longstanding social challenges for any reason. For them, a well-designed therapeutic AI companion prescribed by a licensed mental health professional could act as an adaptive device: not replacing connection and relationships, but making them tolerable, reinforcing them, or holding space until more is possible. Perhaps someday, just as audiologists fit hearing aids, there will be specialists who fit each patient with the right AI relationship prosthesis.
Exact prevalence data aren’t available, but even a narrow subset of this group — people with serious mental illness — likely includes millions who live with persistent relational challenges that resist standard care. That’s a population large enough to demand attention, and deserving of support. AI chatbots aren’t digital Band-Aids, but potential tools for those whose needs exceed what traditional care can provide. Like any prosthetic, these systems would require fitting, supervision, and clinical judgment.
Stability over friction
Human relationships carry conflict and emotional complexity. Most of us learn to manage that messiness. For some, though, it's overwhelming. A prosthetic relationship could offer a reliable anchor — steady and responsive — while reinforcing reality testing, self-regulation, and psychoeducation.
Beyond easing loneliness, these systems could provide ongoing coaching — practicing social skills, modeling constructive communication, and guiding symptom management in real time. For these users, stability may be more therapeutic than authenticity.
Function over form
The real test is not whether these relationships feel conventional, but whether they help people function. If a prosthetic tie allows someone to work, care for others, or show up in community, then it has done its job.
A future patient might look something like this: a 45-year-old executive, a former Marine whose battlefield discipline helped him excel in business but left scars he’s never fully shaken. Years of therapy brought little relief, and antidepressants impaired his performance. He now relies on a female avatar — steady, kind, and unfailingly constructive — who guides him through conflicts at work and home, even coaching him on board reports. She encourages him to sustain real-world ties while offering stability he finds nowhere else. Though he keeps this hidden, the relationship anchors him, and together they periodically assess his readiness to risk intimacy again. For him, the AI is not merely a surrogate for love, but a stabilizing support — one that helps him keep showing up while he heals.
How it could work
If health care treated prosthetic relationships like other devices, three principles would guide their use:
Eligibility: Reserved for people whose long-standing relational impairments have not responded to standard treatment and whom clinicians certify as likely to benefit.
Safeguards: Tiered models could range from light daily support to higher-dependency ties, with informed consent and regular review for risks like isolation or overuse. All systems would require certification as medical-grade.
Parity: If insurers cover wheelchairs and hearing aids, why not this? Coverage could be tied to measurable gains in work, caregiving, or social participation.
The first generation could rely on stable, auditable text — with future versions extending to phones, wearables, or earpieces. Tone matters: Some need warmth, others restraint. Customizable voices or avatars — professional, friendly, or playful — could make prosthetic relationships safer and more effective.
Accuracy is just as crucial. A mental health support system must not fabricate or reinforce delusions. Medical-grade AI must acknowledge uncertainty, flag errors, and avoid presenting low-probability guesses as fact — especially when emotional safety is at stake.
Regulators are starting to respond. In September, the Federal Trade Commission opened an inquiry into whether AI companions expose youth to harm. The American Psychological Association launched a Digital Badge Program to certify tools that meet clinical and privacy standards. These are early but essential steps. They signal that prosthetic relationships are not a science-fiction idea. They are arriving now. Health systems and insurers should begin treating prosthetic relationships as a legitimate branch of cognitive support with oversight and measurable outcomes. Not as replacements for intimacy, but as provisional supports — until or unless better therapies arrive.
The question is no longer whether AI can become a friend. It’s whether it can become a dependable support — strong enough to keep someone connected when nothing else will.
Harvey Lieberman, Ph.D., is a clinical psychologist and consultant who has led major mental health programs and now writes on the intersection of care and technology.