AI Trained on One Person’s Brain Activity Identifies Colors Seen by Another

The experiment aligned retinotopic maps across people to reveal conserved color signatures.

Overview

  • Researchers at the University of Tübingen report in JNeurosci that group-trained models decoded which of three hues and two brightness levels a left-out participant viewed.
  • Fifteen adults underwent fMRI while viewing red, green, or yellow stimuli, and a linear classifier in a leave-one-participant-out design predicted both color and luminance from brain activity (see the sketch after this list).
  • The team built a common response space from retinotopic mapping, allowing comparisons across individuals without relying on each person’s anatomy.
  • Visual areas showed consistent, region-specific biases across the visual field, with central locations tending toward yellow preferences and the periphery toward red.
  • The authors stress that shared neural patterns do not prove identical subjective experiences and note limits including the small sample size, the narrow stimulus set, and fMRI’s coarse spatial resolution.
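
To make the leave-one-participant-out scheme concrete, here is a minimal sketch, not the authors' code: it simulates voxel response patterns for 15 participants, labels each trial by hue, and uses a linear classifier trained on 14 participants to predict the hue seen by the held-out one. All data, array shapes, and labels below are illustrative assumptions.

    # Illustrative sketch only: simulated data, not the study's pipeline.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)

    n_subjects, trials_per_subject, n_voxels = 15, 60, 200
    # Simulated voxel patterns, one row per trial (assumption, for illustration)
    X = rng.normal(size=(n_subjects * trials_per_subject, n_voxels))
    # Hue label per trial: 0 = red, 1 = green, 2 = yellow (assumption)
    hue = rng.integers(0, 3, size=n_subjects * trials_per_subject)
    # Participant ID per trial, used to hold out one person at a time
    groups = np.repeat(np.arange(n_subjects), trials_per_subject)

    # Train a linear classifier on 14 participants, test on the left-out one,
    # repeating so each participant serves as the test set once.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, hue, groups=groups, cv=LeaveOneGroupOut())
    print(f"mean left-out-participant accuracy: {scores.mean():.2f}")

On random data this accuracy hovers near chance (about 0.33 for three hues); the study's result is that real, retinotopically aligned brain activity supports above-chance prediction for a person the model never saw.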