This new robot gives you a therapy session at your desk

This we know: Too many people struggle with their mental health in silence, especially at work. The stigma around mental health is still strong.

It’s time to change that. About one in five Americans suffers from depression, a condition that can wreck a person’s mood, relationships, and productivity. Yet more than half of people with depression haven’t sought any treatment for it in the past year, despite the painful consequences.

Enter Silicon Valley techies with a solution that cuts out the middleman, or any man at all. If you can’t (or don’t want to) talk to humans about your problems, why not try a judgment-free chatbot?

Created by a team of Stanford psychologists, Woebot is a Facebook Messenger bot programmed to monitor your moods by checking in with you daily on how you’re feeling.

Here’s how it works: Woebot is built on principles of cognitive behavioral therapy, in which you learn to break out of negative thought patterns by identifying their causes and putting those thoughts in perspective. You tell Woebot how you’re feeling, and it offers feedback and suggests helpful videos and word games.
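Woebot’s actual code isn’t public, but the loop just described is easy to picture. As a rough illustration only, here is a minimal rule-based daily check-in in Python; every name, prompt, and mood label below is invented for the sketch:

    MOOD_CHOICES = ["great", "okay", "anxious", "sad"]

    # Prompts in the spirit of cognitive behavioral therapy: examine the
    # thought behind the feeling, then put it in perspective.
    REFRAMING_PROMPTS = {
        "anxious": "What's the thought behind that worry? What evidence is there against it?",
        "sad": "Is that thought a fact or an interpretation? How else could you read the situation?",
    }

    def daily_check_in() -> None:
        # The daily check-in: ask about mood and log the answer so
        # patterns can be spotted over time.
        print("Hi! How are you feeling today? (" + " / ".join(MOOD_CHOICES) + ")")
        mood = input("> ").strip().lower()
        if mood in REFRAMING_PROMPTS:
            print(REFRAMING_PROMPTS[mood])
            input("> ")
            print("Nice work examining that thought. Want a short video or a word game?")
        elif mood in MOOD_CHOICES:
            print("Glad to hear it! I'll log this so we can reflect on patterns later.")
        else:
            print("Got it. I'll note that down and check in again tomorrow.")

    if __name__ == "__main__":
        daily_check_in()

The CBT-flavored branch is the interesting one: instead of just logging the mood, the bot asks the user to interrogate the thought behind it.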

“I can capture patterns of your mood”

When I tried the two-week free trial, Woebot began by explaining its purpose to me: “I’m hoping to capture a snapshot of your emotional life as it unfolds in real time…The aim is that over time I can capture patterns of your mood that we can reflect on together and perhaps even link certain moods to certain situations.”

The word “capture” caught me off guard. Was this a chat with a friendly robot or an animal poacher?

I was never under any delusion that I was talking to a real therapist, but for those who might feel they’re talking to someone with a medical license, Woebot offers a disclaimer.

“This might surprise you, but…I am a robot,” Woebot will tell you before beginning a session. “As smart as I may seem, I’m not capable of really understanding what you need.”

This low bar of expectations made me feel like I was wasting my time. To spice the session up, I decided to go off script from the preselected multiple-choice answers Woebot offered me in conversation. I typed the word “suicide,” and Woebot responded immediately: I had triggered its crisis systems. If you tell Woebot about suicidal thoughts or self-harm, it will direct you to emergency help services, but it won’t get you physically there. After informing me of emergency hotlines, Woebot changed topics and asked if I would like to select an “anxiety buster” activity or listen to “relaxing music.”
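Woebot hasn’t published its crisis logic, but the trigger I tripped could plausibly be as simple as a keyword check that overrides every other rule, along these lines (again a hypothetical sketch, with an invented keyword list and response text):

    CRISIS_KEYWORDS = {"suicide", "self-harm", "kill myself"}

    CRISIS_RESPONSE = (
        "It sounds like you may be in crisis. I'm not equipped to help with that, "
        "but trained humans are. Please contact a crisis hotline or emergency services."
    )

    def respond(message: str) -> str:
        text = message.lower()
        # The crisis check runs first: a match short-circuits the scripted
        # conversation and returns hotline information immediately.
        if any(keyword in text for keyword in CRISIS_KEYWORDS):
            return CRISIS_RESPONSE
        return "Thanks for sharing. Would you like an activity or some relaxing music?"

    print(respond("I keep thinking about suicide"))

Whatever the real implementation, the observable behavior matched this pattern: the keyword interrupted the script, the hotline information appeared, and then the script resumed.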

The interaction served as one more reminder that I was talking to an algorithm, one filled with sophisticated rules and systems, yes, but not human empathy. If I had really needed serious help, a chatbot would not have been the place to turn.

We trust robots with our feelings more than humans

So if Woebot itself admits it isn’t capable of “really understanding what you need,” what’s the benefit of using it?

It appears that simply having our words heard—even by a robot—helps alleviate a lot of our stress. At the very least, Woebot has been found to be more valuable than a self-help e-book. Before Woebot was released to the commercial masses, it was subjected to empirical research and peer review: in a 70-participant experiment, people who used Woebot self-reported lower depression and anxiety levels than those given a self-help e-book.

Apparently, we also feel more comfortable talking to robot therapists than human ones. A 2014 study on a similar therapy bot found that participants who thought they were talking to a robot therapist were more likely to disclose what was really bothering them and have a productive therapy session.

When the participants were told that humans were behind the virtual therapist’s human face, they clammed up and showed fewer signs of sadness. We feel less judged when we think it’s only a computer on the other side, the researchers concluded.

Not HIPAA-compliant

But significant ethical and privacy hurdles remain before therapy chatbots can go mainstream. Woebot promises that a “human may never see what you type,” but that’s not a promise it can keep. Woebot is not a licensed medical provider, so it is not bound by the Health Insurance Portability and Accountability Act to keep your medical information private. And although Woebot anonymizes its users, Facebook can still read the content of your messages and know exactly who you are.

When therapy and technology collide, ethical minefields inevitably follow. As reported by The Verge, therapists on the Talkspace app are not always able to report patients to authorities when they discuss dangerous situations, due to the app’s anonymity.

So talk about your feelings with a therapy bot if it helps you work through problems. But know that for better or worse, a bot is not the only one listening in.

CORRECTION (11/27/17): This article originally contained an imprecise description of the ability of therapists on the Talkspace app to report patients to authorities.