
Pocket ‘therapist’: How Taiwan’s youth use AI for mental health
Taipei, Aug 21 (EFE).
Pei-chen (not her real name) is unable to connect with anyone. She is immersed in depression: her partner has just left her, her family lives in another city, and her job doesn’t provide enough income. Blocked and lost, she begins sharing her daily life with ChatGPT and finds much-needed solace there.
“I created a character and trained him into a really good supporter, using a lot of psychological techniques: behavioral therapy, cognitive behavioral therapy, solution-focused brief therapy… And he offered so much emotional support, it’s really amazing,” she says.

Pei-chen is among the young people, both in Taiwan and other Asian countries, who are turning to artificial intelligence (AI) tools to address their mental health issues, taking advantage of the latest technological advances to cope.
These new apps are especially popular in cultures like Taiwan’s, where people tend to be more private when it comes to sharing their emotions with strangers, Yi-fang Chiu, president of the Taiwanese Multicultural Counseling Association (TMCA), tells EFE.

“Western therapy often emphasizes verbal catharsis, emotional expressiveness, or individual agency, but these concepts are very foreign to some Taiwanese people. Taiwanese people are more emotionally restrained and indirect, and they’re more reluctant to talk about their own needs or their actual feelings,” she explains.
24-hour support
Although there is still no official data on how many young people turn to AI to manage their psychological distress, some international analyses, such as a recent study published in the Harvard Business Review, suggest that mental health is among the main motivations for using chatbots today.

Chiu supports this thesis, noting that more and more people are using ChatGPT or other language models to regulate their emotions, a trend that, she asserts, will only become more pronounced in the coming years.
“Why are more and more clients using that? Because it’s efficient, it’s 24/7, it gives immediate responses, sometimes within a second or two, versus, in therapy you have to schedule an appointment, and you have to wait until that appointment, and then you only have 50 or 60 minutes to process that,” she points out.

For Jen-ho Chang, of the Institute of Ethnology at Taiwan’s Academia Sinica, the unlimited memory, customization, and accessibility make artificial intelligence software an interesting counterpoint to human therapy, and even more so in a technologically advanced “digital island” like Taiwan, which is familiar with AI devices.
This technological development contrasts with an alarming social landscape: the share of deaths attributable to suicide among 12- to 17-year-olds in Taiwan has risen from 12.5 percent to 18.4 percent over the last five years, making suicide the second leading cause of death among adolescents, according to the island’s Child Welfare League Foundation.

Will AI replace psychologists?
AI may seem smarter, be available 24/7, and do everything it is told. However, there is one thing it has not yet been able to replicate: the warmth of human interaction, the immediate relief of being supported by another person, of knowing that someone is listening.
“If we go back to the heart of therapy, it’s actually healing and connection, and in some cultures simply being accompanied by someone is already transformative,” says Chiu, who warns of the risks inherent in seeking help solely from chatbots, a “safe listener” that sometimes “oversimplifies things.”

“If clients become reliant on or addicted to AI, sometimes they can delay the deep relational process, which in some cases is actually what they truly need (…) AI can give quick cognitive solutions, but with no real integration,” she explains.
While it helps many people express their emotions openly, AI can also become a double-edged sword: much of the in-person therapy process involves not only venting, but also questioning and transforming perceptions of reality, an exercise that chatbots cannot yet perform.

In fact, Chang believes that for more serious conditions, such as schizophrenia, AI cannot detect delusion or deception, and a human is required for diagnosis.
“AI can assist with some kinds of therapeutic processes, but it cannot replace the whole therapeutic process,” emphasizes the researcher, who instead advocates “combining” traditional therapy with AI to achieve more comprehensive results.
Meanwhile, for people like Pei-chen, ChatGPT has already become a silent and faithful confidant; an algorithm without a body, emotions, or critical thinking, but whose company is enough to soften, at least in part, the sharp edges of pain. EFE
