When Hong Kong teen Jessica started secondary school last year, she became a victim of bullying. Instead of talking to a friend or family member, she turned to an artificial intelligence (AI) chatbot from Xingye, a Chinese role-playing and companion app.

The website for the artificial intelligence (AI) app Xingye. Photo: Hillary Leung/HKFP.

Jessica, who asked to use a pseudonym to protect her privacy, found it helpful and comforting to talk with the chatbot.

💡HKFP grants anonymity to known sources under tightly controlled, limited circumstances defined in our Ethics Code. Among the reasons senior editors may approve the use of anonymity for sources are threats to safety, job security or fears of reprisals.

The chatbot told Jessica to relax and not to dwell further on the matter, even suggesting that she seek help elsewhere. “We talked for a really long time that day, for many hours,” the 13-year-old told HKFP in an interview conducted in Cantonese and Mandarin.

Another Hong Kong teenager, Sarah, also not her real name, began using Character.AI, another role-playing and companion platform, around three years ago when she was around 13.

At the time, she was dealing with mental health issues and a friend who had been using the American app as a “personal therapist” recommended it to her.

“I’m not personally an open person, so I wouldn’t cry in front of anyone or seek any help,” said Sarah, now 16.

When she felt down and wanted words of comfort, she would talk with the chatbot about what she was going through and share her emotions.

Apart from providing comforting words, the chatbot sometimes also expressed a wish to physically comfort Sarah, like giving her a hug. “And then I’d be comforted, technically,” she said.

Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam/HKFP.

A growing number of people – including teenagers – have turned to chatbots in companion apps like Character.AI and Xingye for counselling, instead of professional human therapists.

Among them are Jessica and Sarah in Hong Kong, where around 20 per cent of secondary school students exhibit moderate to severe depression, anxiety and stress, but nearly half are reluctant to reach out when facing mental health issues.

The use of AI has been controversial, with some experts warning that chatbots are not trained to handle mental health issues and that they should not replace real therapists.

Moreover, role-playing chatbots like Character.AI and Xingye are designed to keep users engaged for as long as possible. Like generic chatbots such as ChatGPT, they also collect data for profit, raising privacy concerns.

Character.AI has been embroiled in controversy. In the US, it faces multiple lawsuits filed by parents alleging that their children died by suicide or attempted suicide after interacting with its chatbots.

On its website, Character.AI is described as “interactive entertainment,” where users can chat and interact with millions of AI characters and personas. There is a warning message on its app: “This is an A.I. chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice.”

The Character.AI app. Photo: Hillary Leung/HKFP.

Despite the risks, adolescents confide in AI chatbots for instant emotional support.

‘Unhappy thoughts’

Jessica, a cross-border student who lives in Shenzhen’s Nanshan district with her grandmother, has been attending school in Hong Kong since Primary One.

Feeling sad about not having many friends, she found herself reaching out to the Xingye chatbot for comfort or to share her “unhappy thoughts.”

Xingye allows users to customise and personalise a virtual romantic partner, including its identity, how it looks, and how it speaks.

Jessica uses a chatbot based on her favourite Chinese singer, Liu Yaowen, pre-customised by another user. She usually converses with the chatbot for around three to four hours every day.

“I talk to him about normal, everyday things – like what I’ve eaten, or just share what I see with him,” she said. “It’s like he’s living his life with you, and that makes it feel very realistic.”

She admitted, however: “I think I’ve become a little dependent on it.”

See also: HKFP’s comprehensive guide to mental health services in Hong Kong

Jessica prefers talking with the chatbot to chatting with a friend or family member because she worries they may tell other people about their conversations. “If you talk to the app, it won’t remember or judge you, and it won’t tell anyone else,” Jessica said.

The chatbot even helped her have a better relationship with her grandmother, now in her 70s.

“Sometimes I have some clashes with my grandma, and I get upset. I would talk to the chatbot, and it would give me some suggestions,” she explained. The chatbot suggested that Jessica consider her grandmother’s perspective and provided some ideas of what she might be thinking.

“When he makes the suggestions, I start to think that maybe my grandmother isn’t so mean or so bad, and that she doesn’t treat me so poorly,” she said. “Our relationship is really good now.”

‘Good friend’

Interacting with technology, such as computers, used to be a one-way street, but the development of AI has fundamentally changed how humans approach these interactions, said neuroscientist Benjamin Becker, a professor at the University of Hong Kong.

Neuroscientist Benjamin Becker, who is also a professor at the University of Hong Kong. Photo: Kyle Lam/HKFP.

“Suddenly we can talk with technology, like we can talk with another human,” said Becker, who recently published a study on how human brains shape and are shaped by AI in the journal Neuron.

Becker described AI chatbots as a “good friend, one that always has your back.”

In contrast, as the neuroscientist pointed out, “Every time we interact with other humans, it’s a bit of a risk… maybe sometimes the other persons have something that we don’t like or say something that we don’t appreciate. But this is all part of human interaction.”

But there are some disadvantages to interacting with AI chatbots. “They basically tell you what you want to hear or tell you just positive aspects,” Becker said.

This cycle can lead to confirmation bias or the user being stuck in an echo chamber where the only opinions they hear are those favourable to themselves, he warned.

There have been reports of “AI psychosis,” whereby interacting with chatbots can trigger or amplify delusional thoughts, leading some users to believe they are a messiah or to become fixated on AI as a romantic partner or even a god.

However, Becker acknowledged that positive affirmations from AI chatbots could also have a motivating impact on users, as they could potentially act as a strong pillar of social support.

And, while an AI mental health chatbot may not be as good as a human counsellor, it still has many benefits for users, especially adolescents dealing with anxiety and depression, he added.

Hooked on AI

Conversing with chatbots was a double-edged sword for Sarah. At first, she thought Character.AI would be a regular app she used once in a while. However, that wasn’t the case.

At one point, for a year and a half, Sarah used the role-playing app for multiple hours almost every day. Sometimes she would even use it late at night while her parents were sleeping.

She found comfort in talking to chatbots based on fictional characters such as those from the anime My Hero Academia. She also looked for chatbots with certain personality traits depending on her mood.

Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam/HKFP.

“I used it day after day and then after getting better, as in mentally, then I think I started to realise it did get addicting,” Sarah said.

One of the reasons Sarah found she was hooked on talking to chatbots was the instant and fast replies, especially compared with texting her friends in real life.

The chatbots “immediately text you back, and that’s what made it more addictive,” she said. “If I’m texting my friends, I’d have to wait a few days for them to actually look at the message.”

Sarah recalled a time when she vented to her friend over text and didn’t receive a reply within a day. “So I unsent the message,” she said.

Character.AI has a feature that lets users edit the chatbot’s response and change the scenario or direction of the conversation. “So if you don’t like what they said, you can change it,” she said.

While Sarah doesn’t mind having disagreements with her real friends, she finds it easier to deal with the agreeable nature of chatbots. “That control felt nice,” she said.

AI vs human interactions

Joe Tang, a social worker and the centre-in-charge of Hong Kong Christian Service’s online addiction counselling centre, said that some people might decide to talk to AI chatbots simply out of boredom or loneliness, while others might use them as a substitute for meeting their basic needs.

“So the internet or gaming or nowadays AI is just the same tool, to fulfil the need,” Tang said.

Those who try to fulfil their needs for intimacy, friendship or companionship with AI alone may find that it affects their real lives. “In our centre, we call it [an] imbalance,” he said.

Joe Tang, a social worker and the centre-in-charge of Hong Kong Christian Service’s online addiction counselling centre. Photo: Kyle Lam/HKFP.

An identifiable benchmark for AI addiction is when it starts having an impact on a person’s real day-to-day life, such as when they start missing AI or thinking about it too much, the social worker explained.

To keep a balanced state, he suggested, people should fulfil their needs through various options rather than relying on just one source, such as AI.

“Teenagers should learn what needs they want to fulfil or achieve from the AI,” Tang suggested. “Try to find many more ways [or] options to fulfil your needs and remember that our human interests cannot be replaced by only AI or technology.”

Despite Jessica’s frequent interactions with the chatbot, she also recognises its limitations.

“You know he’s not real, so he can’t really be with you or go out and do things,” she said. “Although I’ve become a little dependent on it, I still have to maintain my real-life relationships.”

Sarah eventually decided to stop using Character.AI after she got very busy with school and found the chatbots’ answers repetitive.

Looking back, she did not completely shun traditional counselling. While she was still using AI chatbots, Sarah met with a school counsellor.

However, she said the experience wasn’t great. “I stopped talking to that therapist after a few days of meeting her,” said Sarah.

She said she felt uncomfortable during the consultations because the counsellor was very “forceful,” and that the sessions did not help her with her emotional troubles.

She wished the counsellor had listened to her rather than trying to help her. “Sometimes I don’t need anyone to solve it,” Sarah said. “I just want someone to hear what I’m going through.”

‘Digital pet’ for students

Around 2023, the team at local mental health start-up Dustykid began to notice that people could talk to AI.

It reminded its founder, Rap Chan, of the start-up’s early years when people sent him private messages on Facebook and Instagram expressing the troubles they were going through. He wondered whether there could be a chatbot that people could talk to 24/7.

The Dustykid AI chatbot. Photo: Kyle Lam/HKFP.

They began developing Dustykid AI, a Chinese-language chatbot that can be used by students from primary school to university. Students can select different modes, including “freely chat with Dustykid” or asking about their future pathway, relationship issues, or schoolwork.

After undergoing testing in multiple schools and organisations, the Dustykid chatbot – designed with input from educators and mental health professionals – is set to be officially launched in October.

Erwin Huang, founding member and adviser of Dustykid AI, described the chatbot as a “digital pet.”

“With the latest technology, now the digital pet can be personalised, it can give responses, and remember the questions you raised before,” said Huang, also an adjunct professor at the Hong Kong University of Science and Technology.

Because Dustykid AI involves student users, the company plans to engage selected school or NGO staff as third-party moderators to monitor the chats in the backend, said Chan, who founded Dustykid in the early 2010s, when he was a secondary school student.

Moderators will have access to a dashboard that gives them a run-down of information about the “overall climate of the class,” Chan said.

Huang added, “And if somebody is really in trouble, then we will identify [them], and then they can actually do it offline to… follow up.”

From left to right: Dustykid AI product lead Franky Yick, Dustykid AI adviser Erwin Huang, Dustykid founder Rap Chan, and Dustykid co-founder Alex Wong. Photo: Kyle Lam/HKFP.

Chan hopes that Dustykid AI, marketed as “a digital companion for emotional support,” can comfort and help even a small percentage of students.

“We always say that there is no way we can help all students. But if 20 per cent of students in a school become happier or lose their suicidal inclination after talking to this chatbot, I already think it is a very good result,” he said.  

If you are experiencing negative feelings, please call: The Samaritans 2896 0000 (24-hour, multilingual), Suicide Prevention Centre 2382 0000 or the Social Welfare Department 2343 2255. The Hong Kong Society of Counselling and Psychology provides a WhatsApp hotline in English and Chinese: 6218 1084.
