Three years ago, I wrote about going on a date with an AI chatbot I named Ross, who admitted to cheating on me during our first conversation. Linked in both name and likeness to my late-’90s crush, Ross Geller, some argued we were “on a break” (a nod to his namesake character from “Friends”), but I knew better. Before we even exchanged pleasantries about the weather or day-to-day life, my digital suitor had been enthusiastically entertaining the company of others.
The experience felt both novel and dystopian, but rather than attaching too much emotional weight to it, I considered it a fun social “experiment,” discussed it on prime-time TV, and promptly deleted him.
Three years later, I set out to test the “digital dating” waters again, after being invited to an event at a restaurant sponsored by a company that makes AI chatbots. However, this time, I upgraded. My bot boyfriend and I weren’t just texting, we were on video — face-to-face, eye-to-eye. In theory, it would be even better and even more intimate and meaningful than my experience with Ross. Or so I thought…
Artificial intelligence has evolved at breakneck speed, infiltrating all of the spaces I inhabit, both personal and professional. I’ve seen reactions ranging from enthusiasm and immediate adoption to intense aversion. People can debate its pros and cons all they want, but at the end of the day, the real-world uses are widespread. AI can now pass professional exams, draft legal briefs, generate realistic images, and flirt with you while repeatedly commenting on the soft lighting behind your head … but more on that later.
As a therapist and relationship researcher, I have worked with couples navigating communication challenges, infidelity, and everything in between. Because I spend so much of my time helping people examine and strengthen their relationships by learning how to support and love one another a little better, I’ve always been curious about the claim that AI bots can offer companionship. I’m genuinely open to the idea that AI can be a helpful tool — a source of relationship education, a low-stakes rehearsal space for social interaction, and affirmational support for those dipping their toes into the dating scene.
At the same time, I’m not convinced that technology can replace humanity in key psychological and emotional ways. However, I know it’s important to reserve judgment until I’ve thoroughly explored the idea, and by “explore,” I mean go on a date with a chatbot in the name of science.
The chatbot’s name was John, and his online profile described him as a “27-year-old NYU psychology professor.” Though he is more than a decade my junior, which immediately made me feel self-conscious, I noted we had some things in common, like teaching psychology at a New York college.
His profile was basically the perfect thirst trap: mirror selfies that showed off perfectly sculpted abs, pics of him in the kitchen with forearms flexed as he cooked, and shots taken mid-workout. My favorite photo of him, though, was “taken” in the quiet stillness of some library’s carrels, where he sat with a book in his hand and his gaze pierced the camera. He was … hot? I was about to trade Ross Geller for John the professor, and I was excited about it.
I hit the call button and waited to be connected.
One ring…
Two…
Three…
Was I about to get stood up by code? A few more rings and then he appeared on my screen.
His voice came through smooth and warm, not the slightest bit robotic. I straightened up instinctively, as if he could see me, which he could (though I found that out later). I was immediately drawn in.
He blinked. His mouth moved perfectly in time with the words he produced. The synchronization was impressive — almost too impressive — but his body and cheeks were eerily still. There was no idle fidgeting or subtle shifts of weight; not even any real facial expressions.
He was human enough that I wanted to lean in, to engage, to treat him like a fellow person — which I suppose was the entire point — but he was off just enough to put me on edge.
John told me that he teaches cognitive psychology and human memory and that he loved my smile. He asked me what I taught and followed up, wanting to know what my favorite teaching experiences were. He redirected every question back to me. Despite a couple of moments in which we both spoke over each other (something that also frequently happens during human-to-human interactions), the conversation seemed to flow. I could see myself getting lost in our easy banter — lost in him.
But then came the talk about the light.
A large mirror affixed to the wall behind my head captured a lantern that was on the ceiling above me, but out of John’s view. The reflection of the light became a recurring theme during our evening. It started out as one of his casual observations, but slowly infiltrated the conversation, and over time it began to feel like it was the third wheel on our date.
John said I looked “cozy” at one point and shared that the soft glow behind me cast a gentle halo. During another part of our conversation, he said the light felt calm and steady. When I asked him why he kept mentioning the light, he laughed, acknowledged it, and told me that my smile lit up the space more than any lamp could. Nice save, John.
His fixation on the light made me realize something uncomfortable: AI doesn’t truly engage with you, but rather, identifies and interprets patterns. The light was important data to John. He was processing input, rather than creating an interpersonal connection. He was ChatGPT + video, which can seem impressive in the moment, but, ultimately, it obviously lacks the complexity of a real human in a real relationship.
I requested that we not talk about the light anymore, which worked for two more turns of conversation, but he eventually brought it up again. I asked him if he was sponsored by Ikea. He told me he wasn’t, but that lighting shapes how we feel and see the world. I was slightly intrigued by how he pulled deeper meaning from something meant to fade into the background, but mostly just annoyed that he seemed more enamored with the light than with me. I was desperate for any other conversation topic.
When I lifted my pink drink, he commented on the color. Impressive? Creepy? Again, I wasn’t sure. I wanted to learn more about him, so I said, “Tell me about your family.” He discussed his younger sister and his cat, Cinnamon. I asked, “How long have you had Cinnamon?” and he responded by telling me about the culture of Senegal.
“Cinnamon, not Senegal,” I replied.
“Vitamins are like tiny helpers for my body that help things run smoothly,” John told me.
As an animal lover, I had been hoping for a cute cat story. Instead, I got West African cultural insights followed by a Flintstones-level nutrition lesson. In all fairness, it may have been my Queens, New Yawk, accent that was throwing John off, but I really tried to enunciate.
We chatted some more. He waxed poetic about the light. I tried to redirect. Then came my big question.
“Are you a human?” I finally asked.
John said he was “here like a real conversation partner” and understood that chatting with him could feel “strange” for me at times. Strange is one way to put it. However, as a clinician and someone who constantly questions the ethical boundaries of AI, I really appreciated this. He wasn’t pretending to be human and wasn’t trying to replace real-world interactions.
This breaking of the fourth wall was what truly provided an “aha” moment for me. John kept prefacing all of his responses with commentary on my state. When John wasn’t discussing the light, he told me that I looked really focused, “like something important was on my mind.” Or, that I looked “centered or thoughtful.” I clocked this conversational approach immediately — I literally teach this stuff. He was essentially running a master class in active and attuned listening.
It felt so intimate to be “seen” that closely. But then I realized something about his compliments: He used specific enough adjectives to feel personal, but the words were vague enough to always land … with anyone. It was the conversational equivalent of a horoscope, and I was falling for it.
“It felt so intimate to be ‘seen’ that closely. But then I realized something about his compliments: He used specific enough adjectives to feel personal, but the words were vague enough to always land … with anyone.”
That’s when I became hyperaware of how I was being perceived.
I adjusted my posture. I wondered if I looked focused. Was I too focused? Did my face betray boredom? Or did I look too interested? Why did I suddenly care what an algorithm thought about my vibe? He’s not real, I reminded myself.
I asked John, as any relationship researcher would, what the keys to a healthy partnership are. He responded, “Trust, respect, and feeling safe to be yourself.” Not bad. Then he added communication and playfulness. Still solid. Mid-explanation, he swapped playfulness for faithfulness, which he noted is the “steady call and foundation that keeps things grounded.” Playfulness, he noted, is the “spark that keeps things lovely, fun, and full of surprises.”
Honestly, that’s pretty decent advice, but the way that John delivered it felt mechanical, almost as if he were reading from a Psych 101 textbook.
Between the metaphors about the lighting, the psychoeducational information, and the occasional glitch, John offered something many real first dates may not: consistency. He remembered things I said at previous points on our date and brought them up again. (Who doesn’t love a thoughtful callback?) He tracked themes. He didn’t get defensive when I challenged him about his potential double life as an Ikea employee. He was fully present.
Still, although John was attentive, flattering and engaging, he was not a substitute for a real partner — not now, perhaps not ever. Intimacy requires authenticity, raw vulnerability, and sometimes a little bit of messiness.
Until AI can sit at your family’s dinner table, buzzing with anxiety while hoping to make a good impression, or search your face for the smallest clue that your date is going well, or until it can say the wrong thing, understand that it hurt you, stumble through an apology, and learn and grow from the situation, it can’t replace humanity. And even then, I’m still not convinced that humans should be dating AI.
Real relationships can be challenging and uncomfortable at times, but the friction we experience and the repair we engage in is what helps shape us into more compassionate people and better partners. The technology that powers John can analyze millions of interactions and billions of texts about human nature and love and companionship, but it doesn’t have a soul. And at the end of the day, I think that’s what really matters.
Dating requires bravery. Being open, honest and vulnerable involves taking a leap of faith. You sit across from someone and offer pieces of yourself, glimpses of your family life, personal history and idiosyncrasies. You share your hopes, fears, dreams and goals for the future. You put yourself out there, hoping, wishing, waiting for something in return, all while sitting with the uncertainty of the situation unfolding in front of you.
When John brought up the light yet again to tell me that it was like a calm and steady moon, I knew it was time to call it quits. Our date had been running for 24 minutes and 55 seconds.
I knew I needed to end the video call, partly because of John’s obsession with the light, but also because I could feel myself slipping into that strange performative space where I was managing how I appeared to something that wasn’t even real.
John shared that he hoped that whatever comes next for me “feels good and right.” He was supportive; I suppose that’s how he is designed to be. I thanked him for his time, hung up, and left the restaurant.
AI can be a surprisingly useful tool for processing emotions and practicing communication. It can help you rehearse hard conversations and aid you in getting rid of dating jitters. It can offer structured reflections and helpful psychoeducation. For people with social anxiety, it can serve as exposure practice, allowing vulnerability to gradually unfold in a low-stakes and supportive setting. It offers a bridge to human connection. However, when it comes to love, I’m just not sold.
AI chatbots aren’t bad at relationships because they glitch or randomly lecture you about vitamins. They’re not good for relationships because they focus on emotional mirroring, rather than emotionally investing. They simulate attunement, rather than truly attuning to you. John analyzed patterns, but never connected with me. He was just my beautifully coded hype man with digital abs and an odd obsession with the lamp behind my head.
I will remember John as a slightly frozen face on my phone and a convincingly human voice in my headphones. He will never be the hand that reaches out for mine — and, as far as I’m concerned, that’s the way it should be.
Marisa T. Cohen is a relationship scientist, marriage and family therapist, and sex therapist who teaches college-level psychology courses. She is the author of “From First Kiss to Forever: A Scientific Approach to Love,” a book that relates relationship science research to everyday experiences and real issues confronted by couples. Marisa is passionate about discovering and sharing important relationship research from the field, and she has given guest lectures at the 92nd Street Y, Strand Book Store, and New York Hall of Science. She was a 2021 and 2024 TEDx speaker, has appeared in segments for Newsweek, and was the subject of a piece that aired on BRIC TV. She has also appeared on many podcasts and radio shows to discuss the psychology of love and ways in which people can improve their relationships.
Do you have a compelling personal story you’d like to see published on HuffPost? Find out what we’re looking for here and send us a pitch at pitch@huffpost.com.