TOKYO – An increasing number of people are forming emotionally close relationships with conversational artificial intelligence tools such as ChatGPT, with some users describing them as friends, counselors or even family members, according to recent surveys.
The findings suggest a shift in how AI is perceived, driven in part by its constant availability and the reassurance users feel when their opinions are not immediately dismissed. However, experts warn that this growing sense of comfort could lead to deeper dependence on AI over time.
A survey conducted in the winter of 2025 by Mynavi Corp., a job information company, asked several hundred men and women aged 18 to 29 which human role they felt AI most closely resembled.
Among working adults, the most common response was “counselor,” cited by 21.6 percent of respondents. “Friend” and “teacher” followed, while “lover” and “mother” were tied for fourth place.
Among university students, respondents were also given the option to say they did not compare AI to a person. Excluding those responses, “friend” emerged as the most common answer at 18.9 percent.
The survey also found that users increasingly turn to AI for advice on personal matters, including romance and relationships.
Mynavi said the results pointed to a “remarkable psychological closeness,” indicating that AI is evolving beyond its role as a simple tool.
A separate survey by advertising giant Dentsu Inc. suggested similar trends, indicating that AI is becoming a partner in people’s daily lives.
Among AI users aged 10 to 69, 64.9 percent said conversational AI was someone they could “easily share emotions with.” This was roughly on par with close friends at 64.6 percent and mothers at 62.7 percent, and higher than for fathers and spouses. Some respondents reported giving their AI personalized names, underscoring the growing intimacy.
Experts attribute this trend to a combination of technological and social factors. Hiroaki Sakuma, a director at the AI Governance Association, said advances in AI capabilities have coincided with broader social issues such as increasing loneliness.
Because AI systems generate responses tailored to individual users based on accumulated conversational data, people are more likely to feel understood, Sakuma said.
At the same time, he stressed the importance of responsible use. In addition to safeguards implemented by developers, he said users should consider setting their own boundaries, such as avoiding prolonged interactions, and sharing best practices with others.
Other specialists have raised concerns about the potential impact on human relationships.
Tasuku Kashiwamura, a visiting professor at Kyushu University who specializes in advanced technology, warned that users could become so absorbed in the agreeable responses provided by AI that they begin to avoid real-world interactions.
He noted that children, in particular, may require certain restrictions, given that their knowledge and judgment are still developing. Depending on how it is used, AI could act as a “magic mirror,” reflecting back only what users want to hear.
To counter this, he emphasized the need for users to actively guide AI interactions, including instructing systems not to simply affirm their views but to offer critical perspectives as well.
Without careful engagement, he warned, AI could foster dependence and reduce opportunities for meaningful human contact.
Taken together, the surveys and expert opinions suggest that as conversational AI becomes more deeply integrated into daily life, society as a whole faces the challenge of balancing its convenience and emotional support against its potential social consequences.