
Project Cicerone 

Cicerone is an AI-enabled haptic wearable navigation headset prototype that helps blind and visually impaired individuals walk in straight lines effortlessly and reach destinations independently.

  • Role: Lead Prototyper, Product Designer

  • Duration: 3 months 

  • Project Type: User-Centered Design Class Project at UW

Accessible Design | Hardware Product Design | User-Centered Design

Team Members

Tianxiang (Dave) Cai

 Lead Prototyper, Product Designer

Amodini Khade

Research Lead

Jacob Ybarra

Usability Specialist, Product Designer

Kunal Mehta

Accessibility Specialist

Research Methods

Survey

We asked a variety of questions, ranging from the tools blind individuals use to navigate to the elements of their navigation journey that cause hassle.

Interview

We interviewed survey respondents to gain a deeper understanding of the issues they mentioned.

 

We focused in particular on how individuals envisioned certain tools or processes being improved.

Narrative

We closely observed our participants' actions as they navigated their paths.

 

This helped us witness first-hand some of the challenges our respondents described in the other two methods.

Team Cicerone implementing Narrative research on UW Seattle Campus


1  —

Project Summary

Cicerone is a class project for the User-Centered Design course in UW's Human Centered Design & Engineering program.

 

The prompt for this project was "Back to a Future Life Together," which asked us to design something other than an app or website that enables people to "do life together."

The project took place over 12 weeks, from October to December 2022. The team conducted the work in Seattle, WA, with several participants joining from other states and overseas.

2  —

Our Solution

Our team designed a "Wizard of Oz" high-fidelity prototype powered by a bone-conduction headphone, a tablet-controlled MIDI controller, a set of wireless lavalier microphones, and a human simulating an artificial intelligence.

 

The system complements existing assistive tools such as the white cane, online map apps, and the accessibility features built into modern smartphones.

3  —

Stakeholders

Our primary users are blind and visually impaired individuals.

We also explored how service dogs, helpers, people on the street, and video call services (Aira / Be My Eyes) can be potential stakeholders for our design.

01 - project overview

How can we enable visually impaired and blind individuals to navigate their desired path efficiently and connect with their communities effortlessly?

Research Findings

01

Haptic Feedback

All participants mentioned that current tools lack haptic feedback.

Auditory alerts can be annoying and are sometimes easily ignored.

02

"Last Mile" Navigation

All participants mentioned that current navigation tools do not provide granular details such as distinguishing between two storefronts or the exact room one wants to enter.

Design Goals

  • Complement existing assistive technology

  • Provide the safest path, not the quickest

  • Provide haptic feedback for obstacles

  • Include accurate directions, especially for last-mile navigation and spaces where there is no shoreline to follow

  • Take the impact of noise levels in outdoor environments into account

Cicerone Sketch

Cicerone Concept Design Sketch

Scenarios

Personas

Our personas emerged from interviews and narrative research. We designed our device to help users throughout their daily lives while prioritizing their safety. Although we examined the goal- and role-directed desires of each fictional persona, we also humanized them. We often referred back to them and asked ourselves, "What would Sam think of this new iteration? This still helps her on her way to school, right?"

Background

Constraints

Our recruitment pool of blind and low-vision individuals was small, so our fictional personas became the foundation of our user research. Their scenarios continually helped us iterate on new ideas when we did not have in-person testers. However, this also limited our scope, as our personas closely resembled our interviewees and user testers.

prototype

principles 

explained.

Wizard of Oz Prototype

MIDI Controller "Launchpad"

plays pre-recorded auditory and haptic cues

Each cue is manually adjusted in terms of loudness, stereo channel, and surround effects

Bone-conduction headphone

provides realistic haptic simulation, allows users to speak with Cicerone, and keeps ears open

Wireless lavalier mic & audio recorder

provides two-way, lag-free, internet-independent communication between users and Cicerone (a human acting as the AI in the prototype)

As the lead prototyper, after holistically evaluating the design requirements and the time and technology constraints, I decided to adopt the "Wizard of Oz" prototyping method for Cicerone.

 

I chose to build this futuristic system from whatever could make a minimum viable product (MVP) work to the greatest extent of our design.

Test 1

Indoor navigation in a hallway with a low-vision participant

Tasks:

  • Shoreline keep assist

  • Detour before obstacle

  • Text Recognition: Vending Machine and Room Sign

Test 2

Controlled outdoor environment navigation in an apartment courtyard with a blind participant

Tasks:

  • Turn-by-turn guidance in an unfamiliar and complex environment

  • Emergency hazard alert

  • Description of the physical features of human beings and objects

  • Public transit simulation

Test 3

Indoor multiple-level navigation with a blind individual

Tasks:

  • Stairs guidance

  • Door knob locating assistance

  • Object detection

  • Detour

  • Door sign and keypad lock assistance

prototype

Usability Test

* Test participants used the Cicerone prototype as complementary assistance to complete tasks designed by the team

  • Find an alternative to Zoom as it doesn't relay haptic feedback

  • Give users agency over when they want the surroundings to be described

  • Add a variety of sounds e.g. music, voice commands, and haptic patterns to make navigation less monotonous

  • Simplify device setup

  • Reduce reliance on the internet

insights

from the test

Cicerone uses a LiDAR system to scan the surrounding environment and deploys auditory + haptic feedback to help blind or low-vision users walk on stairs more safely and confidently.

(Wizard of Oz prototype shown in photo)

Cicerone uses an AI-powered camera that reads text on objects when the user requests it by voice.

(Wizard of Oz prototype shown in photo)

Cicerone uses haptic feedback to keep users on their desired path, notify them of obstacles ahead, and provide detour guidance.

Co-Design

Cicerone held two co-design sessions with the accessibility expert on our team, who is a blind individual; this process ensured Cicerone's solution was designed with users' routines in mind.

Spatial Audio + Haptics 

Lead Prototyper Dave uses audio editing software to adjust the stereo channel of Cicerone's cues
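The hand-tuned stereo placement described above can be approximated by a standard constant-power panning law. A minimal sketch of that idea (illustrative only, not the team's actual editing workflow):

```python
import math

def constant_power_pan(pan: float) -> tuple[float, float]:
    """Map pan in [-1.0 (hard left), +1.0 (hard right)] to (left_gain, right_gain).

    Constant-power panning keeps perceived loudness steady as a cue
    moves across the stereo field, so a "turn right" cue sounds equally
    loud wherever it is placed.
    """
    # Map pan to an angle between 0 (hard left) and pi/2 (hard right).
    theta = (pan + 1.0) * math.pi / 4.0
    # cos^2 + sin^2 = 1, so total power stays constant at any pan position.
    return math.cos(theta), math.sin(theta)
```

A centered cue (`pan=0.0`) gets equal gains of about 0.707 per channel, while a hard-right cue (`pan=1.0`) plays only in the right ear, which is how a directional "turn" cue could be steered.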

MIDI Controller as Remote Interface

We exported audio and haptic patterns to an iPad MIDI controller ("Launchpad") and customized its interface for different tasks

Human as AI

Due to the tech and time constraints, we were unable to build a real AI assistant with LiDAR, so team members acted as our AI.

They gave necessary voice instructions to users through the bone-conduction headset while remaining hidden from view.

How I prototype a future technology MVP

no $$$, no emerging tech, less than 10 days

02 - user research 

03 - ideate, design & prototype

04 - test

user flow charts

corresponding to 3 main scenarios

Sketch Ideations

iterations

We iterated our prototype to resolve the following issues from usability testing:

  • More diverse and clearer haptic feedback patterns

  • Less internet dependency

  • Two-way, real-time communication between the user and the Cicerone Staff behind the Wizard of Oz prototype

I think there was a drastic difference

in the first prototype and the final solution

-- User Tester "Aiden"

high-fidelity

prototype

What could be done differently?

  • Bridge the gap between research and design better.

  • In the usability test, test more scenarios and add more challenging tasks.

  • Consider more how different low-vision and visually impaired users do things differently.

corner cases

  • Complex road conditions

  • Weather sealing

  • Ambiguous commands

  • Emergency reactions

next steps

  • Build a physical prototype of the obstacle-avoidance feature using an Arduino and an ultrasonic distance sensor, or even a real LiDAR.

  • Keep engaging with study participants to understand their daily lives better.
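The planned Arduino obstacle-avoidance prototype would ultimately map sensor distance to a haptic pulse rate. One possible mapping, sketched in Python for clarity (the thresholds and intervals are illustrative assumptions, not values from the project):

```python
from typing import Optional

def pulse_interval_ms(distance_cm: float,
                      min_cm: float = 20.0,
                      max_cm: float = 200.0) -> Optional[float]:
    """Map an ultrasonic distance reading to a haptic pulse interval.

    Closer obstacles produce faster (more urgent) pulses; beyond
    max_cm the motor stays silent. All thresholds are hypothetical.
    """
    if distance_cm >= max_cm:
        return None  # no obstacle in range: no haptic feedback
    clamped = max(distance_cm, min_cm)
    # Linear map: min_cm -> 50 ms (urgent), approaching max_cm -> 500 ms (gentle)
    frac = (clamped - min_cm) / (max_cm - min_cm)
    return 50.0 + frac * 450.0
```

On an Arduino, the same logic would run in the main loop against an HC-SR04-style echo reading, driving a vibration motor on and off at the computed interval.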

Let’s talk about Numbers:

100%

of participants mentioned that auditory (sound) notifications can get annoying at times.

#6

On average, each participant has tried or is currently using six types of analog or assistive devices in their daily life.

1,020,000

Number of people in the US who are blind

* data from CDC, 2022

05 - reflection
