Disney Patent Details AI-Driven Real-Time Projection for Moving Animatronics

The newly published filing outlines a sensor-driven projection approach that synchronizes projected imagery with figure movement via a real-time rendering engine.

Overview

  • U.S. patent application US18/592,863, published as US20250278879A1 on September 4, 2025, describes projecting dynamically rendered content onto mechanically animated surfaces such as animatronic faces.
  • The system determines surface orientation relative to one or more projectors, ingests sensor data from cameras and microphones, and computes frames in real time to keep light-based expressions aligned with motion.
  • The filing details multi-projector blending and live adjustments in angle, scale, and brightness to render fine facial nuances such as skin texture, wrinkles, eye motion, and shadowing through light rather than complex mechanisms (see the first sketch after this list).
  • Disney cites potential use of AI, including a real-time puppet and possibly large language models, to interpret guest speech or gestures and generate immediate visual and motion responses (see the second sketch after this list).
  • Disney frames benefits as shorter build cycles, reuse of digital character assets, simpler parts, and reduced maintenance, while the application remains pending with no announced deployment timeline.
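
To make the per-frame behavior described in the filing concrete, here is a minimal, hypothetical Python sketch of the kind of loop it outlines: track the surface pose, weight each projector by how squarely it faces the surface, and adjust scale and brightness so the blended image stays registered with the motion. The names (SurfacePose, Projector, blend_weight, render_frame) and the specific formulas are illustrative assumptions, not terms or methods from the patent.

```python
# Hypothetical sketch of a pose-driven, multi-projector frame update.
from dataclasses import dataclass

import numpy as np


@dataclass
class SurfacePose:
    position: np.ndarray   # 3D position of the animatronic surface (meters)
    normal: np.ndarray     # unit normal of the projection surface


@dataclass
class Projector:
    position: np.ndarray    # 3D position of the projector (meters)
    base_brightness: float  # nominal output level, 0..1


def blend_weight(pose: SurfacePose, projector: Projector) -> float:
    """Weight a projector by how squarely it faces the moving surface."""
    ray = pose.position - projector.position
    ray = ray / np.linalg.norm(ray)
    # Cosine of the angle between the projector-to-surface ray and the surface
    # normal: 1.0 means head-on, 0.0 means edge-on (that projector contributes
    # little to the blend).
    return max(0.0, float(np.dot(-pose.normal, ray)))


def render_frame(pose: SurfacePose, projectors: list[Projector]) -> list[dict]:
    """Compute per-projector scale, brightness, and blend weight for one frame."""
    weights = np.array([blend_weight(pose, p) for p in projectors])
    if weights.sum() > 0:
        weights = weights / weights.sum()
    settings = []
    for projector, weight in zip(projectors, weights):
        distance = float(np.linalg.norm(pose.position - projector.position))
        settings.append({
            "scale": 1.0 / distance,                           # shrink content as throw distance grows
            "brightness": projector.base_brightness * weight,  # dim the off-axis projector
            "blend_weight": float(weight),
        })
    return settings


if __name__ == "__main__":
    pose = SurfacePose(position=np.array([0.0, 1.5, 2.0]),
                       normal=np.array([0.0, 0.0, -1.0]))
    projectors = [Projector(np.array([-1.0, 1.5, 0.0]), 0.8),
                  Projector(np.array([1.0, 1.5, 0.0]), 0.8)]
    for i, s in enumerate(render_frame(pose, projectors)):
        print(f"projector {i}: {s}")
```

In a real system the pose would come from encoders or camera tracking and the output would drive a full rendering engine; the point here is only the structure the filing describes: sense the surface, reweight the projectors, and re-render every frame.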
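The interaction side can be sketched the same way. The stub below stands in for the speech or gesture interpreter the filing alludes to (possibly a large language model) and maps a recognized intent to a projected expression and a motion cue. interpret_guest_input, the intent labels, and the response table are all hypothetical placeholders, not Disney's design.

```python
# Hypothetical sketch of the sense-interpret-respond loop: guest input is
# classified into an intent, which selects an immediate visual and motion response.
from dataclasses import dataclass


@dataclass
class Response:
    expression: str   # projected facial expression to render next frame
    motion: str       # mechanical motion cue for the animatronic


def interpret_guest_input(transcript: str) -> str:
    """Stand-in for a speech/gesture interpreter (e.g. an LLM-based classifier)."""
    text = transcript.lower()
    if "hello" in text or "hi" in text:
        return "greeting"
    if "?" in text:
        return "question"
    return "unknown"


# Simple intent-to-response table; a production system would drive the
# real-time rendering engine and motion controller instead of string labels.
RESPONSES = {
    "greeting": Response(expression="smile", motion="wave"),
    "question": Response(expression="curious_tilt", motion="lean_forward"),
    "unknown": Response(expression="neutral_blink", motion="idle"),
}


def respond(transcript: str) -> Response:
    return RESPONSES[interpret_guest_input(transcript)]


if __name__ == "__main__":
    print(respond("Hi there!"))
    print(respond("What's your favorite season?"))
```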