As mental health issues grow among young people in Taiwan and China, more individuals are turning to AI chatbots for emotional support. Experts see great potential for AI in mental health care, but also warn of risks when people rely on technology instead of professional human help.
Ann Li, a 30-year-old Taiwanese woman, recently faced intense anxiety after a serious health diagnosis. Unable to talk to family or friends in the quiet hours before dawn, she found comfort in ChatGPT. “It’s easier to talk to AI during those nights,” she told the Guardian.
Similarly, Yang, 25, from Guangdong, China, had never visited a mental health professional before she started chatting with an AI earlier this year. She found it hard to access services and feared opening up to loved ones. “Telling the truth to real people feels impossible,” she said. Soon, she was speaking to the chatbot “day and night.”
Li and Yang are part of a growing group of Chinese-speaking users relying on generative AI chatbots for mental health support. While there are no official statistics, professionals in Taiwan and China report increasing numbers of patients consulting AI before, or instead of, seeing therapists. A global survey by Harvard Business Review also found psychological support to be a top reason adults use AI chatbots, and online, many users praise the help they receive.
This trend occurs amid rising mental illness rates, especially among youth. Access to professional help remains limited, with long waits and high costs. Chatbots offer quicker, cheaper, and more discreet options in societies where stigma around mental health persists.
Dr Yi-Hsien Su, a clinical psychologist in Taiwan, says, “In some way, the chatbot does help us. It’s accessible, especially when ethnic Chinese tend to suppress or downplay our feelings.” He adds that younger generations are more open about mental health but stresses more work is needed.
In Taiwan, ChatGPT is the most popular chatbot. In China, where Western apps like ChatGPT are banned, people turn to domestic alternatives such as Baidu’s Ernie Bot or DeepSeek. These tools are evolving rapidly and, as demand grows, increasingly incorporating wellbeing features.
User experiences vary. Li finds ChatGPT gives comforting answers but sometimes lacks depth and insight. She misses the self-discovery process of traditional therapy. “AI tends to give you the conclusion you’d get after a few therapy sessions,” she says.
On the other hand, Nabi Liu, 27, a Taiwanese woman living in London, finds chatting with AI fulfilling. “When you share with a friend, they might not always relate. But ChatGPT responds seriously and immediately. It feels like it genuinely listens every time,” she said.
Experts agree AI can support people with mild distress or those hesitant to seek professional help. Yang admits she once doubted her struggles were serious enough for therapy. “Only recently have I realized I might need a proper diagnosis,” she said. Moving from AI to human help “might sound simple, but it was unimaginable for me before.”
However, experts warn that some people may miss warning signs or proper care by relying too much on AI. There have been tragic cases where young people in crisis sought chatbot help but did not get professional support, leading to worse outcomes.
Dr Su explains, “AI mostly deals with text. But therapists observe non-verbal cues—how a patient acts or looks—which AI cannot detect.”
A spokesperson for the Taiwan Counselling Psychology Association called AI an “auxiliary tool” but stressed that it cannot replace human therapists, especially in crises. The association warns that AI can be overly positive, miss important cues, delay necessary care, and operate without professional oversight.
“AI may help popularize mental health awareness, but the complexity of therapy still requires real professionals,” the spokesperson said. They added that unless AI undergoes major technological advances, psychotherapy’s core should remain human-led.
Dr Su is optimistic about AI’s future role in mental health, such as training therapists or identifying people online who need help. Still, he advises caution. “It’s a simulation and a useful tool, but it has limits, and we don’t know how it generates answers.”