A multimodal HMI that bridges the accessibility gap — giving blind and low-vision riders the confidence, safety, and autonomy they deserve.
Imagine you were injured and had no partner or friend to drive you where you needed to go. How would your life change? For 25 million Americans, this is an everyday reality: transportation insufficiency stemming from cognitive, sensory, or motor impairments.
Without reliable, usable transportation, these individuals are further left behind and isolated from society. Autonomous vehicles promise independent mobility, but most human-machine interfaces are built for sighted users first.
Blind and low-vision riders face barriers at every stage: identity verification, trip confirmation, en-route updates, unexpected events, and safe exit. We set out to design a multimodal interface that restores confidence, safety, and autonomy.
Because fully autonomous vehicles are so new and rapidly evolving, we limited our literature review to work published in 2014 or later to capture the most accurate, up-to-date research. After keyword searching and abstract filtering, 27 articles were identified and read in full.
We decomposed the full ridesharing journey, from boarding to exit, mapping impairment-related needs at each step. Each task became a study scenario against which we measured trust, navigation accuracy, and satisfaction.
The prototype combined a rear-seat mounted display, a natural-voice external speaker, and high-contrast tactile hardware buttons — designed to work seamlessly together across all six trip tasks.
A between-subjects study with N = 24 participants across three interface conditions, structured to isolate the effect of multimodal audio on trust, satisfaction, and navigation accuracy for visually impaired riders.
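For a between-subjects design like this, a one-way ANOVA is a standard way to test whether the three conditions differ on a measure such as trust. The sketch below is illustrative only: the scores are placeholder values, not the study's data, and the group names (`visual_only`, `audio_only`, `multimodal`) are assumptions about how the three conditions were labeled.

```python
# Hypothetical sketch: one-way between-subjects ANOVA across three conditions.
# All scores below are placeholders, not the study's actual data.

def one_way_anova(groups):
    """Return the F statistic for a one-way between-subjects ANOVA."""
    k = len(groups)                        # number of conditions
    n = sum(len(g) for g in groups)        # total participants
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-groups sum of squares: spread of condition means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: spread of scores around their own condition mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Three conditions x 8 participants each (N = 24), placeholder trust ratings (1-5)
visual_only = [3.1, 2.8, 3.4, 2.9, 3.0, 3.2, 2.7, 3.3]
audio_only  = [4.0, 4.2, 3.9, 4.1, 4.3, 3.8, 4.0, 4.2]
multimodal  = [4.6, 4.8, 4.5, 4.7, 4.9, 4.4, 4.6, 4.7]

f_stat = one_way_anova([visual_only, audio_only, multimodal])
```

A large F statistic relative to the critical value for (2, 21) degrees of freedom would indicate the conditions differ; pairwise follow-up tests would then locate which condition drives the effect.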
Across all three key measures, trust, correct vehicle identification, and navigation, the multimodal condition restored performance for visually impaired riders to near-parity with sighted users. The visual-only condition performed significantly worse.
The current prototype validated the multimodal approach in a controlled Wizard-of-Oz setting. Four directions would move this toward real-world deployment.