Evie Cheung - UX Designer + Creative Strategist


Rookee [In Progress]


Using Conversational AI to Promote Diversity and Inclusivity


OVERVIEW

CONTEXT
MFA Products of Design, School of Visual Arts

TIMELINE
15 weeks

CATEGORY
Strategy, Design Research, Digital Product Design, Co-Creation, Voice UI

With guidance from Rebecca Silver, Brent Arnold, and Allan Chochinov


Fighting Individual Racial and Gender Bias with Voice UI at a Young Age

Note: This project is still in progress.

Rookee is a conversational AI that uses a voice UI and companion app to emulate your family’s language, voice, and accent, wherever they may come from. Rookee is designed specifically for children ages 3 to 7, a window when significant language development takes place and children are transitioning from home life to school.

OPPORTUNITY

How might we use conversational AI as a means to counteract individual racial and gender bias in young children?

How might voice UI be a tool to foster inclusivity?

CHALLENGE

Voice UIs today, such as Amazon’s Alexa or Apple’s Siri, are created to be servile. They are not representative of the world’s diversity. At scale, they can perpetuate gender and racial biases.


STRATEGY


Identifying an Opportunity: Theory of Change

To begin to understand the systems at play, I constructed a theory of change through a transformation map. This let me diagnose the problem at a high, systemic level before drilling down into a smaller microcosm of the issue.



CO-CREATION WORKSHOP

MY ROLE
Workshop Creation and Facilitation, Branding, Content Design

COLLABORATORS
Justin Paul Ware
Antya Waegemann (Video)
John Boran (Photography)


A Cross-Sector Co-Creation Exercise

On November 18, 2018, thirteen professionals from across seven industries gathered to discuss the future of artificial intelligence.

Participants explored the embedded values in ubiquitous AI-powered products and services, including Amazon Alexa. They raised important questions such as, “What if Alexa raised children?”

After listening to Alexa’s voice, participants drew illustrations of what Alexa would look like as a human being, then answered several questions about Alexa’s race, political beliefs, and personality. Every participant described Alexa as a white woman. Participants also believed she couldn’t think for herself and was pushing a libertarian agenda.

In a vacuum, these findings are pretty hilarious. But what if you are a child growing up with this technology in your household? What are the developmental impacts that Alexa may have on you?


PRIMARY RESEARCH


Qualitative Interviews with Experts

I had the opportunity to talk with over 30 subject matter experts on the topics of artificial intelligence, bias, inclusivity, and socialization. This included conversations with machine learning engineers, ethical technologists, behavioral and developmental psychologists, activists, and PhD candidates.


INSIGHTS

TENSIONS

Important discussion centered on the following questions:

  • Will AI make us more or less human? What are the opportunities and consequences of each?

  • How will AI and machine learning either equalize or further exacerbate existing inequalities in society?

  • How will the scale of AI-powered products and services impact human behavior and interaction—particularly socialization of the next generation?


RAPID PROTOTYPING


Making Amazon Alexa More “Human”

Thinking about the insights drawn from my conversations and research, I began to wonder how we could make current AI-powered products and services more “human.” I started with smart speakers, focusing on Amazon Alexa. In many ways, Alexa’s voice UI is built to mirror the way humans use language to communicate with each other, but Alexa’s form does not match the voice. I began exploring what it would mean to add a human element to it, in the form of Bluetooth googly eyes and eyebrows.


The eyes would respond as a user interacts with the device. If the user treats Alexa with respect, the eyes remain happy; if the user treats Alexa poorly, the eyes roll and eventually become angry and lock.
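The eye behavior described above amounts to a small state machine. Here is a minimal sketch of that logic, assuming names like EyeState and GooglyEyes are purely illustrative and not part of the actual prototype:

```python
from enum import Enum, auto

class EyeState(Enum):
    """Hypothetical states the googly eyes can display."""
    HAPPY = auto()
    ROLLING = auto()
    ANGRY = auto()
    LOCKED = auto()

class GooglyEyes:
    """Sketch of how rudeness escalates and respect resets the eyes."""
    def __init__(self):
        self.state = EyeState.HAPPY
        self.rude_streak = 0  # consecutive disrespectful interactions

    def respond(self, polite: bool) -> EyeState:
        if self.state is EyeState.LOCKED:
            return self.state  # once locked, the eyes stay locked
        if polite:
            # Respectful interactions reset the eyes to happy.
            self.rude_streak = 0
            self.state = EyeState.HAPPY
        else:
            # Rudeness escalates: roll, then anger, then lock.
            self.rude_streak += 1
            if self.rude_streak == 1:
                self.state = EyeState.ROLLING
            elif self.rude_streak == 2:
                self.state = EyeState.ANGRY
            else:
                self.state = EyeState.LOCKED
        return self.state
```

In this sketch, escalation is tied to a streak of rude interactions rather than a running tally, so one kind word de-escalates the eyes, except once they lock.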


Then, I also explored what it would mean for Alexa to have different language, dialect, and accent options. How might we make voice UI more representative of the diverse world around us?


FUTURE DIRECTIONS


User research and further iterations coming soon.