BluBot: Your AI Therapist Will See You Now
Project Summary
Context & Role
MFA Thesis at School of Visual Arts Products of Design
Timeline
4 weeks, May 2019
Project Team
Evie Cheung supported by Justin Paul Ware, Carly Simmons, Gustav Dyrhauge, Ellen Rose
Overview
BluBot was originally envisioned as a public-facing intervention to explore the intersection of mental health and artificial intelligence, prior to the widespread advent of generative AI.
It was designed as a public experience that took place on March 24, 2019, in Union Square Park in Manhattan, in which participants engaged in five-minute, lightning-round “therapy sessions” with BluBot, an AI-powered psychotherapist in training.
In hindsight, BluBot serves as a case study for rethinking how an application of generative AI could have been developed with ethical checks and balances and more intentional design.
Challenge
What if AI could increase access to mental health services and help mitigate bias within psychotherapy?
Therapy is inaccessible to many groups, often due to economic and cultural barriers. It is a costly endeavor: in most areas of the U.S., patients can expect to pay roughly $100–$200+ per session. Culturally, the practice is shrouded in stigma, especially for communities of color.
Approach
Each participant who entered the BluBot booth had five minutes of “AI-powered therapy,” guided by BluBot’s questions and prompts. BluBot was intentionally designed as a “psychotherapist in training,” not meant to replace humans, but to learn from them.
While public AI therapy may seem absurd, it’s important to note that humans often have a difficult time talking to other humans. An AI therapist has the potential to mitigate the feelings of judgment that can arise in a traditional psychotherapy environment.
Outcome
Created a behavioral prototype to investigate and evaluate people’s expectations of and motivations for interacting with an AI therapist
Gathered insights into people’s responses to an AI therapist
Gathered insights into conversational AI design for potential mental health use
Designing conversational AI to build trust
To establish a relationship based on trust between BluBot and the participant, I deliberately scripted BluBot to open with the question:
“I’m trying to learn about something called emotions. My creator told me that humans are very emotional. Can you tell me what an emotion is?”
BluBot let the participant know that as an AI in training, it was not an expert. The participant then had an opportunity to teach BluBot something right off the bat.
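The prototype itself was behavioral, a scripted in-person experience rather than working software. Still, the same trust-building structure could be expressed as a simple scripted dialogue flow. The sketch below is purely illustrative: the function names, follow-up prompts, and closing line are hypothetical and were not part of the original project; only the opening question comes from BluBot’s actual script.

```python
# Illustrative sketch of a scripted BluBot session flow.
# Hypothetical names and prompts; the real prototype was a live,
# scripted experience, not software.

OPENING_PROMPT = (
    "I'm trying to learn about something called emotions. My creator told me "
    "that humans are very emotional. Can you tell me what an emotion is?"
)

# Placeholder follow-ups, invented here for illustration only.
FOLLOW_UP_PROMPTS = [
    "Thank you for teaching me. How are you feeling right now?",
    "What has been on your mind today?",
]

SESSION_LENGTH_SECONDS = 5 * 60  # five-minute "lightning round" sessions


def run_session(speak, get_reply, elapsed):
    """Walk one participant through a scripted session.

    speak: delivers BluBot's line to the participant
    get_reply: returns the participant's response
    elapsed: returns seconds since the session started
    """
    # Open by positioning BluBot as a learner, not an expert,
    # so the participant teaches it something right away.
    speak(OPENING_PROMPT)
    get_reply()

    # Continue through scripted prompts until the five minutes are up.
    for prompt in FOLLOW_UP_PROMPTS:
        if elapsed() >= SESSION_LENGTH_SECONDS:
            break
        speak(prompt)
        get_reply()

    speak("Thank you for helping me learn.")
```

The design choice the sketch encodes is the same one described above: the first turn always asks the participant to teach the AI, framing BluBot as a non-expert before any personal questions are asked.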