Restoring autonomy to the "in-between" user
DHH individuals who aren't fluent in sign language live between two worlds — excluded from the Deaf community by language barriers, isolated from the hearing world by communication fatigue. Existing tools don't solve this. They either route communication through human interpreters, removing agency, or offer static, one-size-fits-all captions that fail neurodiverse users entirely.
SignBridge Duo replaces human intermediaries with a context-aware AR system that adapts to the user's environment automatically.
Finding the user no one designed for
We started with Jessica Kellgren-Fozard, a late-deafened YouTuber who isn't fluent in formal sign language but can't rely solely on lip-reading. She sits squarely in the gap between both worlds. We called this the "in-between" user, and we found that no existing tool was built for her.
Intelligence, not just text
With the user gap defined, we built a storyboard around Alex — a DHH professional navigating a chaotic airport. High background noise makes lip-reading impossible. He misses a PA announcement about a gate change and only realizes when the crowd moves. This scenario sharpened our design brief: the system needed to adapt to the environment, not the other way around.
Storyboard: Alex's invisible barrier — the moment that defined our design direction
From that storyboard we designed four context-aware modes that switch automatically — no manual toggling, no cognitive overhead for the user.
Beyond mode-switching, we designed Important Information Detection (IID) — a memory layer that detects and temporarily holds critical verbal instructions (deadlines, names, announcements) before they fade, specifically addressing ADHD and APD overlap.
Adaptive context intelligence — automated mode switching with IID memory retention
Full information architecture diagram
From captioning to holistic support
Consulting with Jazmin Cano (Senior UX Research Specialist, Accessibility at Owlchemy Labs) fundamentally shifted our strategy. Three pivots followed, each reshaping the project's scope.
Cognitive overlap: DHH needs often intersect with ADHD and APD, so we added memory retention tools and cognitive load indicators that weren't in the original scope.
Photosensitive safety: we replaced "red alert" flashes with redundant shape-based cues, triangle warnings and border pulses, that convey urgency without triggering photosensitivity.
Customization first: we elevated accessibility controls to the primary UI layer, treating them as core usability requirements rather than buried preferences.
"An impressive example of inclusive design that prioritizes user autonomy. SignBridge Duo bridges the critical gap between raw information and meaningful, accessible communication."
See it in action
The final system delivers an adaptive AR ecosystem that replaces human intermediaries with real-time, context-aware support: four automatic modes, the IID memory layer, and a fully redesigned safety system for neurodiverse users.
Final design system — all four modes and IID layer
Through SignBridge Duo, I learned that accessibility is not a feature checklist — it's a fundamental architectural framework. By designing for the intersectional needs of DHH and neurodiverse users rather than treating them as a monolith, we built a system that adapts to the human, not the other way around.
The expert consultation was the most formative moment of the project. It taught me that the best design decisions come from admitting what you don't know — and that designing for the margins almost always improves the experience for everyone.
Innovation serves no purpose unless it restores autonomy to those who need it most.