«People are more willing to disclose to an AI than to a human»
Alison Darcy. Image: 1IMAGE Photography/Bryan Brophy.

Alison Darcy founded Woebot, a company that offers therapy via a chatbot. She believes technology can help people with mental health issues and address problems of psychology as a science.

Read the German version here.

Alison Darcy, digital technologies are often blamed for causing mental health problems. But you insist that they can also help in curing them. Why?

Technology is a part of the way we live now. It’s a tool, and it can be used poorly. There are certain economic incentives that may not align with human well-being. For example, much has been written about the attention economy and how it has hijacked our emotions and attention. There are a lot of technology addiction problems in the world today. Nonetheless, these are the tools that we have in our pockets, which affords very exciting opportunities for help and assistance as well. Just because technology has been misused in the past doesn’t mean it can’t be used for good. At Woebot, we are focused on creating technology that is strictly controlled with human health outcomes in mind.


But you also have an economic incentive for people to use your chatbot.

Lots of people who create medical devices and medical technology do so to earn financial reward – this enables that technology to be sustained. It’s about the way in which the experience is monetised. We don’t monetise it in the way the attention economy does, which tries to keep people in the experience as long as possible. That’s part of the reason why we don’t sell Woebot as a Direct-to-Consumer app. Instead, we partner with healthcare providers who pay for our product and freely distribute it among their patients as the physician or the clinician thinks is appropriate. We have to hope that this is a business model that works out.


You started Woebot in 2017. How did this come about?

We founded Woebot because the enormous need for mental health support in the population was very clear, as was the shortage of clinicians. We have two decades of research showing that digital implementations of models like cognitive behavioural therapy are effective – in some cases as effective as therapist-delivered cognitive behavioural therapy. The key issue is engagement. So the question for me was: Can we create a digital solution that people would want to use every day? We experimented with various approaches and eventually created a chatbot. We could see right away that the relational interface was something people could immediately engage with.


How am I supposed to use Woebot?

Woebot can be used in different ways. Some use it regularly for general mental health maintenance, engaging in brief conversations daily. Others use it intensively for a few weeks during difficult periods, experiencing significant symptom reductions. There are also users who turn to Woebot in moments of distress, like taking a mental health «painkiller». Our goal is to provide support that fits into people’s lives without keeping them unnecessarily engaged.

«Our goal is to provide support that fits into people’s lives without keeping them unnecessarily engaged.»

How many people are currently using Woebot?

I don’t know exactly. 1.5 million users interacted with Woebot when it was available for free on the App Store, but that’s not the case anymore, because we are working with regulators in the United States. We think it’s best to distribute it through healthcare channels.


What’s your goal in terms of the number of users?

I don’t have a goal in terms of overall numbers. I mean, we think that everybody could use Woebot. The goal is really to make sure that Woebot is being delivered to folks that need it.

Woebot app. Image: woebothealth.com

Do you use user data to improve Woebot’s algorithm?

Yes, we use de-identified aggregate data to improve our algorithms’ accuracy and effectiveness. User privacy and confidentiality are paramount, and we adhere to regulations to protect user data.


Are people comfortable using an algorithm as a therapist?

Surprisingly, yes. Studies have shown that people are more willing to disclose to an AI and have deeper conversations than if they talk to a human. Of course, human relationships are important, and Woebot will never replace those. But AI can provide effective in-the-moment support without the stigma or judgment often associated with human interaction.


I would assume that using Woebot increases screen time. Do you see that as a problem?

We design Woebot interactions to be brief, typically lasting five to ten minutes. I don’t think that’s necessarily a problem. It’s probably the best five to ten minutes of your day. Our primary focus is on symptom reduction, and once a user feels better, there’s no need to continue the conversation.


At what age can someone start using Woebot?

Right now, we just have deployments with adults. There are a few pilots in hospital settings for adolescents aged 13 and above.


You often stress the importance of access to mental health support. Now, therapists have become more accessible in recent years – and yet mental health issues have not decreased but increased. How do you explain that paradox?

Talking about your mental health became so destigmatized that we just had a huge surge of people with mild to moderate symptoms come on stream. The problem is that the health system today is not well designed to differentiate between symptom levels. The received wisdom is that if you’re suffering from a mental health problem, you should go straight to a therapist, whereas, in fact, many people don’t need to see a therapist. We have to start thinking about how to equip those people with the tools they need to get better quickly. If they’re not responding, we can up-level them to a higher level of care, rather than giving everybody the top level of care.

«Talking about your mental health became so destigmatized that we just had a huge surge of people with mild to moderate symptoms come on stream.»


So you don’t think we’re in a mental health crisis?

Yes, I do. We definitely are in a crisis. When you look at the data, we see alarming increases in mental health issues, particularly among young people. But we’re in a crisis because we haven’t innovated fast enough to meet demand.


In her new book «Bad Therapy», Abigail Shrier argues that there’s already too much therapy, especially for children, and that it can have damaging effects. She also writes critically about mental health apps, arguing that «the rising generation has already received a lot of therapy. Thanks to artificial intelligence, the rain shower may soon become a flash flood.» What’s your take on that?

The data doesn’t support the claim that children are over-therapized; quite the contrary. Talk to any therapist about how long their waitlist is. However, I absolutely agree that therapy can have damaging effects, and we should address the problems of the field. One of them is the so-called «research-practice gap»: what we have actually studied and know to be effective is rarely practiced by clinicians. Psychology is probably the only branch of medicine where people actually get worse as they get more experienced. We need to fix these problems, and I think technology can really help to do that. Our approach with Woebot is systematic and very true to the original conceptualization of the model. I think that’s a strong foundation on which to build.


How do you see the future of AI generally?

If AI follows the «hype cycle», we’re now at «peak hype», where expectations are very high. Typically, we’ll see a lot of companies fail. Over time, things should stabilise, and the manufacturers that are creating solutions with real value should stick around. But it’s very early, and there’s so much about AI that has upended other assumptions about technology, for example Moore’s Law. The pace of progress is breathtakingly fast compared to other technologies.

The interview was conducted at the European Trend Day at the Gottlieb Duttweiler Institute.
