
Broadcom to Embed Camb.AI Voice and Translation Models in Consumer SoCs

Real-world performance remains unproven, and no device timeline has been set.

Overview

  • The partners plan to run Camb.AI speech, translation, and text-to-speech models directly on Broadcom chips using on-device NPUs.
  • They claim ultra-low latency, reduced wireless bandwidth use, and support for more than 150 languages through local processing.
  • Manufacturers could add built-in translation, screen readers, and voice commands to smart TVs, set-top boxes, routers, and home assistants.
  • A tightly edited demo using a Ratatouille clip showed dubbing and audio description, but accuracy in real-world scenarios has not been independently verified.
  • The project remains in testing with no rollout schedule; Broadcom is expanding its AI efforts alongside work with OpenAI, while Camb.AI points to collaborations with Comcast/NBCUniversal, IMAX, NASCAR, and Eurovision, plus $18.5 million in funding.