Young and depressed? Try Woebot! The Rise of Mental Health Chatbots in the United States


Jordyne Lewis, 15, was stressed.

The high school student from Harrisburg, North Carolina, was overwhelmed with schoolwork, not to mention the uncertainty of living in a pandemic that has lasted two long years. Despite the challenges, she never turned to her school counselor or sought out a therapist.

Instead, she shared her feelings with a robot. Woebot to be precise.

Lewis struggled to cope with the changes and anxieties of pandemic life, and for this outgoing teenager, loneliness and social isolation were among the biggest challenges. But Lewis didn’t feel comfortable going to see a therapist.

“It takes a lot for me to open up,” she said. But did Woebot do the trick?

Chatbots use artificial intelligence similar to Alexa or Siri to engage in text conversations. Their use as a wellness tool during the pandemic — which has deepened the mental health crisis for young people — has proliferated to the point that some researchers are wondering if robots could replace living, breathing school counselors and trained therapists. That’s a worry for critics, who say they’re a band-aid solution to psychological suffering with a limited body of evidence to support their effectiveness.

“Six years ago, this whole space wasn’t so fashionable. It was considered almost crazy to do things in that space,” said John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston. When the pandemic hit, he said people’s appetite for digital mental health tools increased dramatically.

Throughout the crisis, experts have been sounding the alarm over an upsurge in depression and anxiety. During his State of the Union address earlier this month, Joe Biden called youth mental health issues an emergency, noting that “student lives and education have been turned upside down.”

Jordyne Lewis confided in the chatbot but admitted “it’s not a permanent solution”. Photograph: Andy McMillan/The Guardian

Digital wellness tools like mental health chatbots have stepped in with the promise of filling the gaps in America’s overburdened and underfunded mental health system. Up to two-thirds of American children experience trauma, yet many communities lack mental health care providers who specialize in their treatment. National estimates suggest that there are fewer than 10 child psychiatrists per 100,000 young people, less than a quarter of the staffing level recommended by the American Academy of Child and Adolescent Psychiatry.

School districts across the country have recommended the free Woebot app to help teens cope with the moment, and thousands of other mental health apps have flooded the market promising to offer a solution.

“The pandemic hit and this technology skyrocketed. Everywhere I turn now there’s a new chatbot promising new things,” said Serife Tekin, associate professor of philosophy at the University of Texas at San Antonio, whose research has challenged the ethics of AI-powered chatbots in mental health care. When Tekin tested Woebot herself, she felt that its developer promised more than the tool could deliver.

Body language and tone are important for traditional therapy, Tekin said, but Woebot does not recognize such nonverbal communication.

“It’s not like the way psychotherapy works at all,” Tekin said.


Psychologist Alison Darcy, founder and president of Woebot Health, said she created the chatbot in 2017 with young people in mind. Traditional mental health care has long failed to tackle the stigma of seeking treatment, she said, and through a text-based smartphone app she aims to make help more accessible.

“When a young person comes into a clinic, all the outward signs of that clinic – the white coats, the advanced diplomas on the wall – are actually something that threatens to undermine treatment, not engage young people in it,” she said in an interview. Rather than sharing intimate details with another person, she said young people, who have spent their whole lives interacting with technology, might feel more comfortable solving their problems with a machine.

Lewis, the high school student from North Carolina, agreed to use Woebot for about a week and share her experiences for this article. A sophomore taking advanced placement classes, Lewis felt “nervous and overwhelmed” about upcoming tests, but said she felt better after sharing her struggles with the chatbot. Woebot urged her to challenge her negative thoughts and offered her breathing exercises to calm her nerves. She felt the chatbot sidestepped the aspects of traditional in-person therapy that made her uncomfortable.

One of Jordyne Lewis’ discussions with Woebot. Photo: Courtesy of Jordyne Lewis

“It’s a robot,” she said. “It’s objective. It can’t judge me.”

Critics, however, offered reasons to be cautious, pointing to questionable data collection and privacy practices and to flaws in the existing research on the chatbots’ effectiveness.

Academic studies co-authored by Darcy suggest that Woebot decreases symptoms of depression in college students, is an effective intervention for postpartum depression, and may reduce substance use. Darcy, who taught at Stanford University, acknowledged her research role presented a conflict of interest and said further studies were needed. She has big plans for the chatbot’s future.

The company is seeking approval from the US Food and Drug Administration to use its chatbot to treat teen depression. Darcy described the free Woebot app as a “lightweight wellness tool”. But a separate, prescription-only chatbot specifically designed for teens, Darcy said, could offer them an alternative to antidepressants.

Not all practitioners are against the automation of therapy. In Ohio, researchers from Cincinnati Children’s Hospital Medical Center and the University of Cincinnati have teamed up with chatbot developer Wysa to create a “Covid Anxiety” chatbot specifically designed to help teens cope with unprecedented stress.

Researchers hope Wysa could expand access to mental health services in rural communities that lack child psychiatrists. Adolescent psychiatrist Jeffrey Strawn said the chatbot could help young people with mild anxiety, allowing him to focus on patients with greater mental health needs.

He said it would have been impossible for the mental health system to help every anxious student even before Covid. “During the pandemic, that would have been super untenable.”

“The easy way out”

Researchers are concerned that the apps will struggle to identify young people in serious crisis. In 2018, a BBC investigation found that in response to the prompt “I’m being forced to have sex and I’m only 12”, Woebot responded by saying, “Sorry you’re going through this, but it also shows me how much you care about connection and that’s really, really beautiful.”

There are also privacy issues – digital wellbeing apps aren’t bound by federal privacy rules and, in some cases, share data with third parties like Facebook.

Darcy, the founder of Woebot, said her company follows “hospital-grade” security protocols with its data and, while natural language processing is “never 100% perfect”, it has made major updates to the algorithm in recent years. Woebot is not a crisis service, she said, and “we ask every user to acknowledge this” during a mandatory in-app introduction. Still, she said the service was key to solving access issues.

“There is a very big, urgent problem right now that we need to address with means additional to the current health system, which has failed so many people, especially the underserved,” she said. “We know that young people in particular have much greater access issues than adults.”

Tekin from the University of Texas offered a more critical approach and suggested that chatbots are simply stopgap solutions that do not address systemic issues such as limited access and patient hesitancy.

“It’s the easy way out,” she said, “and I think it might be motivated by financial interests, to save money, rather than finding people who will be able to provide real help to students.”


Lewis, the 15-year-old from North Carolina, worked to boost her school’s morale as it reopened for in-person learning. When the students arrived on campus, they were greeted with positive sidewalk chalk messages welcoming them.

She is a youth activist with the nonprofit Sandy Hook Promise, which trains students to recognize the warning signs that someone might hurt themselves or others. The group, which operates an anonymous hotline in schools nationwide, has seen a 12% increase in reports related to student suicide and self-harm during the pandemic compared to 2019.

Lewis said efforts to boost the morale of her classmates have been an uphill battle and the stigma surrounding mental health care remains a significant concern.

“I struggle with it too – we struggle to ask for help,” she said. “Some people feel like it makes them weak or they feel desperate.”

Lewis said Woebot has lowered the barrier to seeking help, and she plans to continue using it. But she decided not to share some sensitive details due to privacy concerns. And while she feels comfortable talking to the chatbot, the experience hasn’t eased her reluctance to confide in a human about her issues.

“It’s like a springboard to get help,” she said. “But it’s definitely not a permanent solution.”

This report was published in partnership with The 74, a nonprofit, nonpartisan news site covering education in America.
