Particle.news

Meta Expands AI Accessibility Features for Ray-Ban Glasses and Beyond

New tools include customizable environment descriptions, global volunteer assistance, and advancements in accessible device control.

Illustration of a globe surrounded by symbols associated with common disabilities and communication

Overview

  • Meta has introduced a customizable 'detailed responses' feature for Ray-Ban Meta glasses, offering richer, context-aware descriptions of users' surroundings, now rolling out in the U.S. and Canada.
  • The 'Call a Volunteer' feature, connecting blind and low-vision users to sighted volunteers via Be My Eyes, will launch in all 18 supported countries by the end of May 2025.
  • Meta is advancing research on sEMG wristbands to enable accessible human-computer interaction for users with motor impairments, with ongoing trials and collaborations with Carnegie Mellon University.
  • System-wide accessibility tools, including live captions, live speech, and ASL translation, are now available across Meta's XR platforms and messaging apps.
  • Developers have integrated Meta's Llama AI with Sign-Speak technology to create a WhatsApp chatbot that facilitates communication between Deaf and hearing users through ASL translation.