Overview
- Stanford researchers reported in Cell that implanted microelectrodes plus an AI decoder translated imagined speech in four people with severe paralysis at up to 74% accuracy across a 125,000-word vocabulary.
- The team recorded motor-cortex activity during attempted and internal speech and showed that the two patterns overlap yet remain distinguishable enough for reliable decoding.
- To address privacy, the Stanford system demonstrated a password-style mechanism that prevents decoding of internal speech unless a user-specific keyword unlocks it.
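The password mechanism can be pictured as a gate in front of the decoder's output stream: decoded inner-speech tokens are suppressed until the user's chosen keyword is detected. This is an illustrative sketch only, not the Stanford implementation; the `GatedDecoder` class, the token stream, and the `open-sesame` keyword are all hypothetical stand-ins.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class GatedDecoder:
    """Hypothetical password-style gate on an inner-speech decoder.

    Tokens stand in for the real system's decoded words; the gate
    stays closed until the user-specific unlock keyword appears.
    """
    unlock_keyword: str
    unlocked: bool = False

    def process(self, decoded_token: str) -> Optional[str]:
        """Return decoded text only after the unlock keyword is seen."""
        if not self.unlocked:
            if decoded_token == self.unlock_keyword:
                # The keyword itself is consumed, never emitted.
                self.unlocked = True
            return None  # suppress all output while locked
        return decoded_token


decoder = GatedDecoder(unlock_keyword="open-sesame")  # hypothetical keyword
decoder.process("water")        # suppressed: gate still locked
decoder.process("open-sesame")  # unlocks, emits nothing
decoder.process("water")        # now passes through as "water"
```

The point of the design is that unintended inner speech is never externalized: decoding output is withheld by default and released only on an explicit, user-chosen signal.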
- A UCLA study in Nature Machine Intelligence used noninvasive EEG with an AI ‘copilot’ that improved cursor control and enabled a participant with partial paralysis to complete a robotic-arm task with 93% success after failing with a conventional interface.
- The UCLA system pairs a hybrid EEG decoder with computer-vision and task-context assistants, shares data and code openly, and targets faster, more precise control through expanded training and optimization.
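A common way such a "copilot" assists control, shown here as a generic shared-autonomy sketch rather than the UCLA system's actual algorithm, is to blend the user's decoded velocity with an assistive velocity aimed at a target the vision module has inferred. The function name, the 2-D setup, and the blending weight `alpha` are all assumptions for illustration.

```python
import math


def blend_command(user_vel, target_pos, cursor_pos, alpha=0.6):
    """Linearly blend decoded user intent with copilot assistance.

    user_vel:   (vx, vy) velocity decoded from EEG (stand-in values)
    target_pos: (x, y) target inferred by a vision/context module
    cursor_pos: (x, y) current cursor or end-effector position
    alpha:      weight on the user's command; (1 - alpha) on assistance
    """
    # Assistive velocity: unit vector from the cursor toward the target.
    dx = target_pos[0] - cursor_pos[0]
    dy = target_pos[1] - cursor_pos[1]
    dist = math.hypot(dx, dy) or 1.0  # avoid division by zero at the target
    assist_vel = (dx / dist, dy / dist)

    # Convex combination of user intent and copilot suggestion.
    return (alpha * user_vel[0] + (1 - alpha) * assist_vel[0],
            alpha * user_vel[1] + (1 - alpha) * assist_vel[1])
```

With `alpha = 1.0` the user has full control; lowering `alpha` lets the copilot pull the command toward the inferred target, which is how shared-autonomy schemes trade precision assistance against user agency.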