Show and Tell
An Alexa product identification feature on Amazon’s Echo Show devices to help blind and low vision customers identify common fridge and pantry items.
My Role
UX Designer, Conversation Designer
Device
Echo Show
Year
2019
Team
Alexa for Everyone
The Problem
Identifying pantry items is hard for people who are blind or have low vision. Manual methods are slow and rely on memory, which gets harder with age. Tech tools like PenFriend could only identify certain products, were costly, and easy to lose. Newer apps like Seeing AI and Aira were just arriving but often needed a subscription.
Our Opportunity
Amazon’s Echo Show provided an inexpensive, multipurpose product identifier that never ran out of batteries or got lost. It had access to the entire Amazon catalog of goods and a camera that could do more than just scan barcodes.
“Alexa, what am I holding?”
“It looks like Whole Foods 365 brand Turmeric”
My Role
I served as the Multimodal and Conversation Designer for this project. I worked closely with researchers, engineers, computer vision scientists, sound designers, and our product partners. We also partnered with the Vista Center for the Blind, which helped provide participants for our many rounds of research.
I was responsible for:
0 to 1 UX and Conversation Design: Created the end-to-end user flow and extensive documentation of our VUI components
Scientific Literature Review: Conducted extensive research into how blind and low vision individuals navigate the physical world and adapted VUI guidelines accordingly
Cross-Team Collaboration: Worked extensively with the sound design and computer vision science teams to reduce complexity and streamline the experience
Challenges Faced
Different Users, Different Needs
Blind and low vision users interact with technology, and with their physical world, differently than sighted users do. I was regularly confronted with my own biases as a sighted user, which I had built into my first iteration. I leveraged my skills as a researcher to look into the cognitive and physiological differences of our users, as well as to analyze their physical movements during our prototype testing. This allowed me to adapt our experience to meet their specific needs.
Shifting Paradigms Takes (My) Effort
I was taking the identifier out of the user’s hand and putting it on the counter. This was a new approach to a daily task, which required me to find a balance between providing enough detail for success and not overwhelming users with new information. I also needed to introduce graceful error handling that guided users to success without making them feel at fault for the error.
Finding the Right Success Metric
I noticed that the initial success metrics prioritized tech performance over the user experience. The metrics required the tech to identify an item perfectly, while the user simply wanted to know if they were holding a can of Coke or a can of Sprite. I introduced a Confidence Rating and a new scenario to our user research, which realigned our goals with users’ practical needs and helped us clear our launch hurdles.
There’s more to this story
This overview only scratches the surface of my work on this project. I can provide the full story from problem to solution, including iterations, strategy, and research, one-on-one.
In the meantime, you can learn more about Show and Tell here.