Vocable App on Vision Pro: A New Era of Accessibility for Disabled Users

As a software engineer working in AI, it’s rewarding to work on projects that apply artificial intelligence in ways that clearly and directly help people. Vocable — a free augmentative and alternative communication (AAC) iPhone and iPad app that helps speech-impaired patients communicate with their caregivers — is that kind of project.

The WillowTree team developed Vocable to help one of our own after former WillowTree designer Matt Kubota’s partner, Ana, was diagnosed with Guillain-Barré Syndrome. The rare autoimmune disorder attacked her peripheral nervous system, causing paralysis in her arms, legs, and face and leaving her unable to speak.

Ana and her caregivers initially looked to existing AAC devices, but the available options were either expensive and bulky or rudimentary and outdated (at first, she could only painstakingly communicate by blinking as caregivers pointed to individual letters on an alphabet poster).

We developed the Vocable iOS app, which combines head and face tracking with integrated conversational AI, to help nonverbal and nonspeaking individuals like Ana communicate with those around them. It’s been a powerful experience watching the app change people’s lives as they interact with the world in new ways.

Now, on the Vision Pro, Vocable has evolved into something even more powerful. Spatial computing, combined with the many accessibility features Apple provides, brings disabled users’ virtual and physical worlds together in a single layer. At the same time, it gives users new and familiar options for interacting with both of those worlds.

After the first iPhone was launched in 2007, WillowTree was proud to be one of the first developers to release digital products on the App Store. We’re equally proud that Vocable is one of the first apps optimized and available on the Vision Pro App Store.

Vision Pro Gives Vocable an Accessibility Factor Unlike Any Other Device

The Vision Pro has an impressive range of accessibility options built in, starting with eye tracking as the default method of navigation.

Also by default, users can click by tapping their index finger and thumb together. But Vocable leverages Apple’s advanced accessibility features like Dwell Control, which allows paralyzed and immobilized users (or anyone who can’t use their hands) to navigate and click by blinking or by dwelling on a portion of the interface. Head and hand tracking also enhance accessibility, such as with the Pointer Control feature, where a finger or wrist can be used like a laser pointer.
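To make this concrete, here’s a minimal SwiftUI sketch (illustrative only, not Vocable’s actual code) of how a phrase button on visionOS can be marked as a gaze target. The system hover effect is what eye tracking highlights and what Dwell Control can use to trigger a selection; the PhraseButton name and sizing values are placeholders.

```swift
import SwiftUI

// Hypothetical phrase button with a large, gaze-friendly target.
// The system draws a hover effect when the wearer looks at it, and
// Dwell Control can trigger the same action without a hand gesture.
struct PhraseButton: View {
    let phrase: String              // e.g., "I need water"
    let onSelect: (String) -> Void  // fired by hand tap, dwell, or Pointer Control

    var body: some View {
        Button {
            onSelect(phrase)
        } label: {
            Text(phrase)
                .font(.title2)
                .padding(24)
                .frame(minWidth: 200, minHeight: 80)  // keep the gaze target generous
        }
        .hoverEffect(.highlight)                      // system highlight driven by eye tracking
    }
}
```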

This means on the Vision Pro, speech-impaired individuals and users with other disabilities have greater agency over how they use Vocable and, therefore, greater ease in communicating with their caregivers.

[Image: Soundboard of the Vocable AAC app as it appears on the Apple Vision Pro]

The Vision Pro also layers Vocable’s interface — its custom menus of words and phrases — directly into the user’s field of vision. Rather than focusing on a separate screen or device, users see Vocable superimposed onto their first-person point of view.

This makes viewing and selecting responses smoother than on any separate AAC device, where constraints such as screen size and fewer accessibility options (e.g., only head and face tracking) create bottlenecks. It also allows communication-vulnerable users to remain visually engaged with the world around them.

A More Conversational Experience for Speech-Impaired Patients and Caregivers

Thanks to Vision Pro’s enhanced accessibility features, Vocable now brings speech-impaired individuals and their caregivers closer to the experience of ordinary conversation.

  • A default and customizable Soundboard presents users with pre-populated words and phrases to select.
  • Users or caregivers can also quickly and easily activate Listening Mode by tapping the mic icon or saying, “Hey, Vocable.”
  • The app will then detect other people’s speech.
  • With Vocable’s Smart Assist feature enabled (an integration with OpenAI’s ChatGPT), the app can understand and retain this conversational context and generate potential responses for the wearer to choose from.
  • Users may then glance at a greater range of responses and pick one, allowing conversations to flow more quickly and naturally.
  • Vocable then uses the Vision Pro’s built-in Speech Synthesis to audibly play back the user’s selection (see the sketch below). In an upcoming release, Vocable will also support Apple’s Personal Voice feature, allowing users to respond “in their own voice.”
[Image: Vocable AAC app presenting Apple TV+ show options on the Vision Pro]
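As a rough illustration of that last step, here is a minimal sketch of the playback using Apple’s AVFoundation speech APIs and the iOS 17-era Personal Voice authorization, preferring the user’s Personal Voice when one is available. This is an assumption-laden sketch, not Vocable’s actual implementation, and the PhraseSpeaker name is a placeholder.

```swift
import AVFoundation

// Illustrative sketch of the speak-back step (names are placeholders, not Vocable's code).
// It plays a selected phrase through the system speech synthesizer and, if the user
// has created an Apple Personal Voice and authorized access, prefers that voice.
final class PhraseSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ phrase: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
            // Look for a Personal Voice among the installed voices (iOS 17+ trait).
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: phrase)
            utterance.voice = (status == .authorized ? personalVoice : nil)
                ?? AVSpeechSynthesisVoice(language: "en-US")  // default system voice fallback

            DispatchQueue.main.async {
                self.synthesizer.speak(utterance)
            }
        }
    }
}
```

In a real app, the synthesizer, authorization state, and voice selection would typically be set up once rather than on every call.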


Users Seamlessly Orchestrate Vocable with Other Apps

Just as Vocable makes communication flow more smoothly between speech-impaired people and their caregivers, the Vision Pro makes it easy for patients to move between their physical and digital environments in a single space.

Vision Pro essentially expands the typical laptop or desktop monitor to near-infinite screen real estate. That means Vocable’s semi-transparent Soundboard interface can be placed alongside other apps and windows, letting speech-impaired users communicate with others while they browse Safari, listen to podcasts, or stream videos. Wearers simply place Vocable in their virtual space and shift focus between other apps and the physical world on the other side of the Vision Pro glass.
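For developers curious how that kind of free-floating, glass-backed window comes together, here is a minimal sketch of a SwiftUI visionOS scene. The VocableLikeApp and SoundboardView names, the window id, and the sizes are illustrative placeholders, not Vocable’s actual code.

```swift
import SwiftUI

// Hypothetical visionOS scene: a plain window with the system glass material,
// which the wearer can reposition alongside Safari, video, or any other app.
@main
struct VocableLikeApp: App {
    var body: some Scene {
        WindowGroup(id: "soundboard") {
            SoundboardView()
                .glassBackgroundEffect()        // translucent material over passthrough
        }
        .windowStyle(.plain)                    // no opaque default window chrome
        .defaultSize(width: 900, height: 500)
    }
}

struct SoundboardView: View {
    var body: some View {
        Text("Soundboard placeholder")          // stand-in for the phrase grid
            .padding(40)
    }
}
```

The plain window style plus the glass material is what lets a phrase grid float semi-transparently over passthrough while the wearer repositions it like any other window.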

When a caregiver enters the room to ask, “What would you like for dinner?” the user can respond and then return to whatever they were doing before. This provides disabled users the same kind of spatial flexibility that an executive might enjoy while using the Vision Pro as a wearable workspace.

Is Your App Ready for the Vision Pro?

Skepticism about the Apple Vision Pro is understandable. It represents a convergence of new technologies unlike anything the world has seen in recent years, and the price tag is certainly a consideration. But from a software engineering perspective, there’s so much to be excited about — and lots of reasons to be optimistic about the Vision Pro’s staying power.

To start, the technology Apple has brought to market is so far ahead of existing AR/VR headsets that the Vision Pro looks likely to reset the space. Not to mention, Apple has been seeding the technologies built into the Vision Pro for years now, showing their deep and wide-ranging potential for future applications.

That said, the Vision Pro’s evolution, as Apple better understands the needs of enterprise versus consumer users, probably won’t be as linear as, say, the iPhone’s. Businesses need to be ready with versions of their apps optimized to work across Apple’s evolving ecosystem.

If you’re ready to optimize your existing app for Vision Pro, or develop a new digital experience specifically for spatial computing use cases, WillowTree’s Vision Pro Accelerator can help. Get started by exploring our digital product and delivery services and reach out to learn more about how we can migrate and customize your mobile app to the Vision Pro.
