How We’re Advancing Accessibility at Meta

Takeaways

  • You can now customize Meta AI to provide detailed responses on Ray-Ban Meta glasses, a feature built to assist everyone, but one that can be particularly valuable to people with disabilities.
  • Our Call a Volunteer feature, which connects blind or low vision individuals to sighted volunteers, will launch later this month in all 18 countries where Meta AI is supported.
  • We continue to promote accessibility by researching and developing sEMG wristbands, bringing accessible features to the metaverse, and enabling the creation of accessible software like Sign-Speak.

Today, I’m recognizing Global Accessibility Awareness Day by reflecting on Meta’s continued efforts to create products that promote a more accessible future. Building and improving accessibility features helps ensure we deliver meaningful impact for everyone, and I’m proud to share some of our latest developments.  

Helping People Navigate Life Hands-Free

Ray-Ban Meta glasses offer a hands-free form factor and Meta AI integrations — features that help everyone navigate daily life, but can be especially useful to the blind and low vision community. With Ray-Ban Meta glasses, you can capture and share photos, send text or voice messages, make phone calls, take video calls, listen to music, translate speech in real time, and interact with Meta AI for in-the-moment help.

Since Ray-Ban Meta glasses launched, people have captured and shared millions of moments with loved ones. As we expand availability around the world, we’ve loved seeing the different ways communities use the glasses to live more connected lives.

Starting today, we’re introducing the ability to customize Meta AI to provide detailed responses on Ray-Ban Meta glasses based on what’s in front of you. With this new feature, Meta AI will be able to provide more descriptive responses when people ask about their environment. This feature will begin to roll out to all users in the U.S. and Canada in the coming weeks and expand to additional markets in the future. To get started, go to the Device settings section in the Meta AI app and toggle on detailed responses under Accessibility.

I’m also excited to share that our Call a Volunteer feature, created in partnership with Be My Eyes, will launch later this month in all 18 countries where Meta AI is supported. Call a Volunteer connects blind or low vision individuals to a network of sighted volunteers in real time to help them complete everyday tasks.

Designing for Better Human-Computer Interaction

Wristband devices can facilitate human-computer interaction (HCI) for people with diverse physical abilities, including those with hand paralysis or tremor. We’re exploring these capabilities through our work to develop sEMG (surface electromyography) wristbands at scale for on-the-go interactions with computing systems. Wristbands that use sEMG, or muscle signals, as a form of input are particularly promising for accessible HCI: muscle signals at the wrist can provide control signals even for someone who can’t produce large movements (due to a spinal cord injury, stroke or another disabling event), who experiences too much movement (due to tremor), or who has fewer than five fingers on their hand.
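To make the idea concrete, here is a minimal Python sketch of the general pipeline: raw wrist signals are cut into short windows, features are extracted, and a classifier emits discrete control events. Every name, rate and threshold below is an assumption for illustration, not Meta’s actual implementation.

```python
import numpy as np

# Hypothetical sketch only: none of the names, thresholds or features below
# reflect Meta's actual sEMG models. The point is the shape of the pipeline:
# raw wrist signals -> short windows -> features -> discrete control events.

SAMPLE_RATE_HZ = 2000   # sEMG is typically sampled in the kHz range
WINDOW_MS = 100         # short windows keep the control loop responsive

def extract_features(window: np.ndarray) -> np.ndarray:
    """Classic per-channel sEMG features: mean absolute value, zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    signs = np.signbit(window).astype(int)
    zero_crossings = np.abs(np.diff(signs, axis=0)).sum(axis=0)
    return np.concatenate([mav, zero_crossings])

def classify(features: np.ndarray) -> str:
    """Stand-in for a trained gesture model ('click', 'swipe', 'rest', ...)."""
    # Because even faint muscle activity shows up at the wrist, a real model
    # can be trained on signals far smaller than any visible movement.
    return "click" if features[: features.size // 2].mean() > 0.5 else "rest"

def decode_stream(chunks, channels: int = 16):
    """Turn a stream of raw sEMG sample chunks into control events."""
    window_len = SAMPLE_RATE_HZ * WINDOW_MS // 1000
    buffer = np.empty((0, channels))
    for chunk in chunks:
        buffer = np.vstack([buffer, chunk])
        while len(buffer) >= window_len:
            window, buffer = buffer[:window_len], buffer[window_len:]
            event = classify(extract_features(window))
            if event != "rest":
                yield event  # hand off to the OS as a click, swipe, etc.
```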

The sEMG wristband used for Orion, our AR glasses product prototype, is our latest iteration of this technology. As part of our journey to develop sEMG wristbands for a diverse range of people, we’ve been investing in collaborative research that focuses on accessibility use cases.

In April, we completed data collection with a Clinical Research Organization (CRO) to evaluate how well people with hand tremors (due to Parkinson’s disease and essential tremor) can use sEMG-based models for computer controls (like swiping and clicking) and for sEMG-based handwriting. We also have an active research collaboration with Carnegie Mellon University to enable people with hand paralysis due to spinal cord injury to use sEMG-based controls for human-computer interaction. These individuals retain only faint motor signals, which our high-resolution technology can detect, and we can teach them to use these signals quickly, enabling control as early as Day 1 of system use.

Removing Barriers to Communication

We’re working to make the metaverse more accessible by providing live captions and live speech in our extended reality products. Live captions convert spoken words into text in real time, allowing users to read content as it’s being delivered. The feature is available at the Quest system level, in Meta Horizon calls, and in Meta Horizon Worlds.
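At its core, a captioning loop reduces to consuming audio chunks and rendering whatever the recognizer has heard so far. The sketch below is an illustration, not the Quest implementation; `transcribe` is a hypothetical stand-in for any streaming speech-to-text model.

```python
import queue

# Illustrative live-captioning loop. `transcribe` is a placeholder for a
# streaming speech-to-text model, not Meta's actual API.

def transcribe(audio_chunk: bytes) -> str:
    """Placeholder: a real streaming model returns the words recognized so far."""
    return ""

def caption_loop(audio_chunks: queue.Queue, render) -> None:
    """Consume audio as it arrives and render captions in real time."""
    while True:
        chunk = audio_chunks.get()
        if chunk is None:        # sentinel: end of the audio stream
            return
        text = transcribe(chunk)
        if text:
            render(text)         # e.g., update the caption overlay in the UI
```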

Live speech converts text into synthetic audio, providing an alternative means of communication for people who may struggle with verbal interactions or who prefer not to use their voice. Since launch, we’ve observed extremely high retention of the live speech feature and have rolled out enhancements, including the ability to personalize and save frequently used messages.
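Conceptually, the feature pairs a text-to-speech engine with a small bank of saved messages. The sketch below is an assumption-laden illustration; `synthesize`, `speak` and `save_phrase` are hypothetical names, not Meta’s API.

```python
# Sketch of the live speech idea: typed text becomes synthetic audio, and
# frequently used messages are saved for quick reuse. `synthesize` stands in
# for any text-to-speech engine.

saved_phrases: dict[str, str] = {}

def synthesize(text: str) -> bytes:
    """Placeholder for a TTS engine returning audio samples."""
    return b""

def speak(text: str, play) -> None:
    play(synthesize(text))

def save_phrase(label: str, text: str) -> None:
    saved_phrases[label] = text

# Usage: save a message once, then speak it with one call instead of retyping.
save_phrase("order", "A medium coffee with oat milk, please.")
speak(saved_phrases["order"], play=lambda audio: None)  # wire to a real speaker
```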

[GIF: A person signs ASL to their device and a Sign-Speak avatar responds]

We’re also excited by the ways Llama, our collection of open source AI models, is being used to promote accessibility. Developers at Sign-Speak have paired their API with Llama to create a WhatsApp chatbot that translates American Sign Language (ASL), facilitating communication between Deaf and hearing people. With this software, a Deaf person can sign ASL to a device, and the software translates the ASL into English text for the hearing person. The hearing person can reply by voice or text, and the software signs the message back to the Deaf person through an avatar.
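To illustrate that two-way flow, here is a hedged Python sketch of how such a chatbot could route messages in each direction. The translation functions are hypothetical placeholders; Sign-Speak’s actual API and its Llama integration may look quite different.

```python
from dataclasses import dataclass

# Hypothetical sketch of the two-way flow described above. The translation
# functions are placeholders, not Sign-Speak's actual API or Llama calls.

@dataclass
class Message:
    sender: str           # "deaf_user" or "hearing_user"
    kind: str             # "video", "voice" or "text"
    payload: object       # video/audio bytes or a text string

def asl_video_to_text(video) -> str:
    """Placeholder: sign-recognition model turning ASL video into English text."""
    return ""

def speech_to_text(voice) -> str:
    """Placeholder: speech recognizer for the hearing person's voice messages."""
    return ""

def text_to_signing_avatar(text: str) -> bytes:
    """Placeholder: renders an avatar video signing the given text in ASL."""
    return b""

def route(msg: Message) -> Message:
    """Translate each message into a form the other party can receive."""
    if msg.sender == "deaf_user" and msg.kind == "video":
        # Deaf user signs ASL on camera -> hearing user receives English text.
        return Message("bot", "text", asl_video_to_text(msg.payload))
    # Hearing user's voice or text -> avatar video signing back in ASL.
    text = speech_to_text(msg.payload) if msg.kind == "voice" else msg.payload
    return Message("bot", "video", text_to_signing_avatar(text))
```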

We’re committed to investing in features and products that make connection easier for all, and we’ll continue to evolve to address the needs of the billions of people around the world who use our products.


