Admittedly, I’m a dinosaur when it comes to technology. I still have a paper and pen datebook for my appointments, and though I have a social media presence, I barely know how to navigate it.
But I keep hearing about how AI is here to stay and will likely replace people like me. This horrifies me on a deep level. Not just because I need to pay my mortgage, but because humans are designed for bonding.
I’m frequently banging on about how what makes sex great is connection, emotional attunement, and vulnerability. What will happen to people’s ability to truly connect and be vulnerable with each other if they farm out relationship and sex difficulties to AI?
Humans are wired for bonding through neurochemical processes like oxytocin release during vulnerable interactions, which build trust and empathy. However, increasing reliance on AI for relationship and sex therapy could diminish this ability by creating a simulation of connection that lacks true reciprocity, potentially leading to emotional atrophy and heightened isolation.
AI can facilitate deeper interactions by acting as a “digital wingman,” assisting with communication, offering emotional support, or rehearsing difficult conversations, making real relationships more accessible for those who struggle socially. But AI systems, for all their consistent support and nonjudgmental listening, neither experience emotions nor offer genuine vulnerability, which may train users to prefer low-risk interactions and leave them with less practice navigating human complexities like conflict and rejection.
For instance, AI companions (this boggles my mind) mimic empathy but cannot form mutual bonds, so they risk becoming “emotional fast food”: satisfying short-term needs while eroding long-term relational skills. They offer infinite patience, personalized validation, and no drama, which can make human relationships seem complicated or inferior by comparison. This could deepen loneliness and disconnection for single and coupled people alike. Relationships require work, effort, compromise, repair, forgiveness, and commitment, none of which AI demands.
In the PLISSIT model of sex therapy, AI can dispense the first three levels, Permission, Limited Information, and Specific Suggestions, but it cannot provide the fourth, Intensive Therapy.
In sex therapy, AI might offer sound advice on the physical mechanics and the “how to,” yet fail to model emotional exposure, producing a superficial version of vulnerability and weaker human attachments. I’m told the future might see a divide between those who use AI as a tool to build skills that could enhance bonding and those whose over-reliance fosters dependency, reducing real-world practice in intimacy.
Overall, AI could erode our innate bonding capacity, making vulnerability feel riskier in human relationships. To my mind, that is not a win.