One of my friends has found AI immensely helpful for organizing and catalyzing her thoughts before she shares concerns with her boyfriend. Another friend seeks AI assistance in turning his complaints into feelings, needs, and requests. Still another put all his texts and emails from a former partner into AI and asked it to analyze trends. As a result of this analysis, he not only decided to end a harmful relationship but was able to stick with his decision despite immense pressure, confident in the data trends presented to him. Still another friend, who works in the creative industry, asks AI for marketing ideas so she knows what NOT to do—every suggestion has already been done! In these instances, AI served to enhance cooperation, boundary-setting, and creativity. These are not isolated events: A recent meta-analysis of 15 randomized controlled trials revealed that companion AI significantly reduced symptoms of depression and distress in various clinical and subclinical populations (Xi et al., 2023).
And yet? In an episode of South Park (S27, E1), Randy Marsh confides in his AI chatbot about his parenting concerns while his wife angrily stews in silence in the bed next to him. Case studies continue to emerge of individuals forming intimate relationships with AI (even marrying it), sometimes with dangerous consequences, such as a chatbot-encouraged attempt to assassinate Queen Elizabeth II (Heritage, 2025). Even when chatbots are not hallucinating or giving bad advice, they cannot offer the support needed in a crisis, as reporting on a recent suicide chronicles (Reiley, 2025). While AI can meet relational needs in the short term, it may hinder the development of a robust social network. In a survey of 3,270 individuals in Germany, ages 18 to 74, those who used AI for personal conversation reported higher perceived social isolation and social withdrawal (Hajek et al., 2025).
Kim Malfacini, who works in product policy at OpenAI, wrote a 2025 article exploring the impacts of companion AI on human relationships. She cited concerns about the impact of human–AI relationships on social skills, social motivation, and moral skills. Let’s examine each of these concerns in our own lives.
Are you “deskilling” or “upskilling” socially as a result of AI?
Are you using AI to practice interpersonal skills and receive feedback? Maybe you want to get better at turning your complaints into requests, asking effective questions that draw people out, communicating validation, or sharing parts of your story. By analyzing tone, body language, and word choice, AI products can offer real-time, personalized feedback—helping us recognize patterns, manage emotions, and respond with greater empathy. AI tools offer users opportunities to rehearse conversations and practice particularly difficult social interactions like interviews, public speaking, and conflict resolution. When AI offers exposure opportunities and behavioral rehearsal, we can make mistakes without harmful consequences, receive feedback, and improve our delivery before we “go live.”
On the other hand, if your relationship difficulties have to do with expectation management, AI may exacerbate your relationship problems. AI offers companionship without demands or sacrifice—infinitely patient, attentive, available, agreeable, and complimentary. As one journalist notes, “the basic system design of AI aims to please the user at all costs to ensure they keep using it” (Heritage, 2025). Humans disagree with us, withhold information, and have interests besides keeping us happy. In human relationships, we need to practice kindness, restraint, and conflict management. We are invited to grow, invest, and sacrifice in order to maintain strong connections. Humans are not always available—we need to develop internal resources to meet some of our needs. While human interaction can be more difficult, it invites us to grow more. What are you opting for?
Are you more or less motivated to connect with others as a result of AI?
When you turn to AI as a “social snack” to meet your needs, does this act as an “appetizer,” whetting your appetite for human interactions, or do you experience satiation and no longer want to engage? In a series of three experiments in which participants shared vulnerable experiences of loneliness with both AI and humans, AI was generally perceived as the better listener—offering greater empathy and respect—yet it failed to alleviate participants’ overall experience of loneliness (Weinstein, Itzchakov, & Maniaci, 2025). When algorithms mediate our interactions, we may get our immediate needs met more easily. After all, attempting to meet our needs with other humans is messy, vulnerable, and sometimes uncomfortable. But over the long term, this might trap us in an addictive cycle and keep us from experiencing the genuine support and connection that human relationships provide.
Is AI helping you treat other humans with care, or helping you practice dehumanization?
While the effects of AI on morality remain under-researched, moral psychology researcher Jonathan Haidt predicts that if we give children “A.I. companions that they can order around, that will always flatter them, we are creating people who no one will want to employ or marry.” If you are acting in line with your values, even in your use of AI, great! But if you are practicing disrespect and aggression, AI will not give you the feedback a fellow human would. This practice may strengthen a tendency to dehumanize others and act solely on our own interests, without concern for anyone else.
Conclusion: Enhancing Human Relationships Through AI
AI’s impact on our human relationships is not predetermined. We can stay curious and self-compassionate as we discern the value of AI for our human connections. In order to stay grounded in your own values, explore your personal AI policy through the following self-reflection questions:
How might you use AI to support, rather than replace, your growth as a communicator?
In what ways can you remain present and attuned during in-person interactions?
What boundaries might you set to ensure technology enhances, rather than diminishes, your relationships?
If you or someone you love is contemplating suicide, seek help immediately. For help 24/7, dial 988 for the 988 Suicide & Crisis Lifeline, or reach out to the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Psychology Today Therapy Directory.