AI may offer a listening ear, but can it close mental healthcare gaps?

Written by 36Kr English · 5 mins read

AI-powered therapy tools are emerging in the mental health space, but questions around empathy, safety, and readiness remain unresolved.

What’s the most promising use case for large models in healthcare?

From imaging scans and pathology diagnostics to artificial intelligence agents that simulate doctors, tech companies are working to embed AI across hospitals. One area that has drawn particular attention is mental health.

The appeal of AI as a low-cost mental health companion is growing. At its core, therapy centers on listening and empathy, skills that generative AI happens to be quite good at simulating. AI doesn’t tire, is available at all hours, and, with sufficient data, can sometimes outperform junior counselors.

This potential has caught the eye of investors. In February, Haoxinqing, one of China’s major online mental health platforms, secured a new funding round in the eight-figure RMB range, announcing a pivot toward developing AI-driven mental wellness robots. Startups such as Scietrain and Mirror Ego are exploring niches with emotionally intelligent AI companions or clinically oriented tools.

But is AI really ready to hold human emotions in trust? Can it take on the high-stakes work of mental health risk assessment and decision-making?

Better for venting than treating?

Large language models are good at solving problems through conversation, making them a natural fit for therapeutic contexts. In fact, efforts to use computers in counseling date back decades, but earlier limitations meant responses felt mechanical and emotionally flat.

Now, thanks to advances in AI and natural language processing (NLP), things are different. A model trained on psychology texts and therapy transcripts can, in theory, hold conversations with more depth and contextual awareness.

Major tech firms are actively testing these ideas. JD Health has introduced a companion app powered by its proprietary Jingyi Qianxun model. Alibaba Cloud’s Tongyi Xingchen is similarly geared toward companionship-based applications.

Others are investing rather than building. In 2023, Baidu backed Scietrain, which developed a general-purpose model used in various mental health tools. According to the company, its services have reached nearly ten million users.

These platforms emphasize emotional intelligence. They aim to guide users through emotional distress, reflecting a broader trend toward conversational support in AI-based therapy.

Demand is rising. Social and workplace pressures, combined with greater awareness, have made conditions such as anxiety, depression, and insomnia more visible. But professional resources remain limited. Many hospitals underinvest in psychiatric care, and in rural or lower-tier regions, mental health services may not exist. Outside hospital systems, therapy is costly and uneven in quality.

Lu Wei, a psychiatrist and deputy director at Wenzhou Kangning Hospital, recalled that years ago, most patients in his clinic presented with severe conditions such as psychosis. Now, more people arrive with mild emotional distress. “They are just dealing with emotional stress and don’t know where else to turn, so they end up coming to us,” he said.

In such cases, AI could offer an accessible first stop or alternative route to care.

In March, The New England Journal of Medicine published a study by researchers at Dartmouth College evaluating Therabot, an AI-driven mental health intervention tool. In a randomized trial involving 210 patients with major depressive disorder or generalized anxiety, participants who interacted with Therabot for six hours across four weeks saw their depression symptoms drop by 51% on average, and their anxiety symptoms by 31%. That amount of interaction is roughly equivalent in exposure to eight traditional therapy sessions, and in follow-up surveys, patients reported trust in Therabot “comparable to that of a human therapist.”

The researchers emphasized that Therabot was not intended to replace clinicians but to fill service gaps and expand access. Chinese developers seem to share the same philosophy.

That said, the field remains nascent. Even advanced apps often feel formulaic. Users describe a pattern: the AI offers comfort, signals empathy, then lists solutions in a composed manner.

This may reflect the nature of the training data. Conventional therapy sessions follow a counselor-led structure. But in AI-driven chats, users control the pace and topics, disrupting that rhythm. If a model is trained only on scripted sessions, it may struggle to adapt to free-form conversations.

There are also serious safety concerns. One licensed counselor described a case in which an AI failed to recognize subtle suicidal ideation in a user’s messages and even encouraged them to “be brave and act on their feelings.”

In traditional clinical settings, therapists can escalate care, refer patients for psychiatric support, or prescribe medication. AI lacks those safety mechanisms.

“It’s best to think of AI as a pressure valve,” the same counselor said. “It’s not about diagnosing or curing. It’s just a way to say what you need to say, and sometimes, that’s enough.”

Efficient for screening, assistance, and recovery support

Business viability is another hurdle. Many AI therapy tools target consumers directly, but converting casual users into paying subscribers is difficult, even where demand exists.

Online mental health services have experienced waves of growth before. During the pandemic, platforms like MyTherapist, Yidianling, and Easy Psychology, along with “internet hospitals” such as Haoxinqing and Zhaoyang Doctor, saw spikes in use.

Still, most platforms struggled to turn online therapy into a standalone business. Revenues often relied on partnerships with drug companies or hospitals, not therapy sessions themselves.

The same challenge applies today. “Unless kids start chatting with AI all the time, subscription models won’t scale,” one developer said, noting their company had no plans for a consumer-facing launch.

Instead, firms are focusing on narrower roles within care delivery where AI can automate repetitive tasks.

One startup building digital tools for brain disorders developed an AI agent for cognitive training in users with impairments. “The exercises are simple and structured,” a team member explained. “They used to be delivered manually, but now AI can do it consistently.”

Another strategy is targeting institutional clients. Mirror Ego, for example, offers a suite of mental health services covering screening, diagnosis, and intervention. Its clients include schools, government offices, and businesses. By the end of 2024, it had worked with more than 200 schools in China.

Some large platforms are integrating AI therapy tools into their broader healthcare ecosystems. JD Health, which generates significant revenue from pharmaceutical sales, has embedded mental health tools into its services. According to Chinese outlet HealthInsight, while JD Health’s AI tool is free, it refers about 1% of users to paid services like therapy or medication.

That’s a small conversion rate, but even marginal gains matter in a saturated market.

Compared to established platforms, companies focused on AI-first mental health solutions are still in early development. “It’s early days,” one industry veteran said. “Things are moving fast, but no clear winners have emerged. It’s too soon to tell who’s going to lead, or even where the finish line is.”

KrASIA Connection features translated and adapted content that was originally published by 36Kr. This article was written by Hu Xiangyun for 36Kr.

