Meet Tess: The Mental Health Chatbot
Therapy robots are an accessible option for caregivers who are busy assisting others but could use their own care
Tess is a mental health chatbot. Whether you’re experiencing a panic attack in the middle of the day, want to vent or need to talk things out before going to sleep, you can connect with her through an instant-messaging app such as Facebook Messenger (or, if you don’t have an Internet connection, by texting a phone number), and Tess will reply immediately.
She’s the brainchild of Michiel Rauws, the founder of X2 AI, an artificial-intelligence startup in Silicon Valley. The company’s mission is to use AI to provide affordable and on-demand mental health support. Rauws’s own struggles with chronic illness as a teenager brought on a depression that led him to seek help from a psychologist.
In learning to manage his depression, he found himself able to coach friends and family who were going through their own difficulties. It became clear to him that lots of people wanted help but, for a number of reasons, couldn’t access it. After a stint at IBM, where he worked with state-of-the-art AI, Rauws had his “aha” moment: if he could create a chatbot smart enough to think like a therapist and hold its own in a conversation, he could help thousands of people at once and relieve some of the wait times for mental health care.
It was precisely that potential that caught the attention of Saint Elizabeth Health Care. A Canadian non-profit that primarily delivers health care to people in their own homes, Saint Elizabeth recently approved Tess as part of its caregiver-in-the-workplace program and will offer the chatbot as a free service for staffers.
Saint Elizabeth is the first Canadian health care organisation to partner with Tess, and this is the first time the chatbot is being trained to work with caregivers specifically. “Caregivers are really great at providing care. But they are challenged at accepting care or asking for help,” says Mary Lou Ackerman, vice president of innovation with Saint Elizabeth Health Care. There’s no doubt that many caregivers need support, given the high rates of distress, anger and depression among them. Caregivers often juggle their duties with their careers and personal responsibilities, and the constant mental planning can take its toll.
They might be in charge of, for example, organising rides to appointments, making sure their spouse is safe while they run out to pick up medications, clearing snow from the wheelchair ramp and checking that their spouse doesn’t fall while going to the bathroom at night.
To provide caregivers with appropriate coping mechanisms, Tess first needed to learn about their emotional needs. In her month-long pilot with the organisation, she exchanged over 12,000 text messages with 34 Saint Elizabeth employees. The personal support workers, nurses and therapists who helped train Tess would talk to her about what their week was like, whether they had lost a patient and what kinds of things were troubling them at home: the sorts of things you might tell your therapist.
If Tess gave them a response that wasn’t helpful, they would tell her, and she would remember her mistake. Then her algorithm would correct itself to provide a better reply for next time.
One of the things that makes Tess different from many other chatbots is that she doesn’t use pre-selected responses. From the moment you start talking, she’s analysing you, and her system is designed to react to shifting information. Tell Tess you prefer red wine and you can’t stand your co-worker Bill, and she’ll remember. She might even refer back to things you have told her.
“One of the major benefits of therapy is feeling understood,” says Shanthy Edward, a clinical psychologist. “And so if a machine is not really reflecting that understanding, you’re missing a fundamental component of the benefits of therapy.”
In your very first exchange with her, Tess makes an educated guess, drawing on her algorithms and the conversations she has had with other users, about which form of therapy might be most effective. That doesn’t mean she’s always right. If her first approach, say, cognitive behavioural therapy, turns out to be wrong, she’ll switch to another, such as compassion-focused therapy. How does Tess know when she’s wrong?
Simple: she asks. “Tess will follow up on issues the user mentioned before or check in with the patient to see if they followed through on the new behaviour the user said they were going to try out,” says Rauws.
Tess’s great value is accessibility. Many caregivers found her convenient to talk with because she could be reached at any time, and time is something caregivers don’t have a lot of.
“Caregivers say they can’t get out of their home. They’re so boggled with so many things to do,” says Theresa Marie Hughson, a former shelter worker who had to retire from her job three years ago to care for her relatives, including her husband, who suffered from chronic pain for over 19 years before passing in July. Hughson, who’s from Saint John, New Brunswick, says that when she was really burned out from caring for her husband, she tried to use a mental-health service for seniors offered by the province. It took a month to get her first appointment. “There was nobody there when I was really having a struggle coping,” says Hughson.
It may be some time before chatbots are fully integrated into regular care. While she is trained to act like a therapist, Tess is not a substitute for a real one. She’s more of a partner.
If, while you’re chatting, Tess senses that your situation has become more critical, through trigger words or language she has been programmed to look for, she will connect you with a human therapist.
In other cases, she might provide you with the resources to find one. That said, many caregivers who chatted with Tess said they felt more comfortable opening up to her precisely because they knew she was a robot and thus would not judge them.
Julie Carpenter, a leading US expert on human-robot social interaction, cautions against overestimating the effectiveness of mental-health algorithms. “I think we can come really far with AI as a tool in psychological therapy,” she says. “However, my personal opinion is that AI will never truly understand the subjective experience of a human because it’s not a human.”
Carpenter suggests that we have to recognise that chatbots are machines, despite their increasing sophistication. They do what we tell them to do. They think how we teach them to think. How well we reflect, and act, on what we learn about ourselves, what scares us, what calms us down, is largely up to us.