Yale Unveils Adaptive Computation Model for Dynamic Human Attention

Tracking tests with moving targets among distractors demonstrated the model’s accuracy in predicting moment-to-moment attentional shifts.

Image: The model also helps explain a familiar human quirk: the ability to tune out task-irrelevant objects, such as a billboard or a passing sports car, while crossing a busy street. Credit: Neuroscience News

Overview

  • The adaptive computation framework describes how limited perceptual resources are allocated to prioritize goal-relevant information in dynamic scenes (see the illustrative sketch after this list).
  • Researchers conducted computer-based experiments in which volunteers tracked highlighted circles moving among identical distractors to test the model.
  • Model predictions aligned with participants’ sub-second focus changes and subjective assessments of task difficulty across varying distraction levels.
  • Data revealed a computational signature of cognitive effort, linking the model’s resource allocation to perceived exertion during prolonged attention tasks.
  • Yale researchers Ilker Yildirim and Mario Belledonne published their findings in Psychological Review, reported on June 26 and 27, highlighting potential applications for more human-like AI systems.
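
The paper's implementation details are not reproduced here. As a rough illustration of the resource-allocation idea in the first bullet, the Python sketch below splits a fixed attention budget across objects in proportion to their task relevance and derives a toy effort measure. The function names (`allocate_attention`, `effort_proxy`), the softmax allocation rule, and all parameter values are assumptions made for illustration, not the published model.

```python
# Illustrative sketch only (not the published Yale model): a fixed perceptual
# budget is divided across objects in proportion to task relevance, and a crude
# effort proxy grows with how concentrated the allocation is and with distractor load.
import numpy as np

def allocate_attention(relevance, budget=1.0, sharpness=4.0):
    """Split a fixed attention budget over objects via a softmax on relevance."""
    weights = np.exp(sharpness * np.asarray(relevance, dtype=float))
    return budget * weights / weights.sum()

def effort_proxy(allocations, distractor_load, load_cost=0.1):
    """Toy 'cognitive effort' signature: concentration of the allocation
    (sum of squares) plus a penalty that grows with distractor load."""
    return float(np.sum(np.asarray(allocations) ** 2) + load_cost * distractor_load)

# Example: two highlighted targets tracked among six identical distractors.
relevance = [1.0, 1.0] + [0.2] * 6   # targets are goal-relevant; distractors are not
alloc = allocate_attention(relevance)
print(np.round(alloc, 3))            # most of the budget goes to the two targets
print(round(effort_proxy(alloc, distractor_load=6), 2))
```

In this toy setup, adding distractors or sharpening prioritization changes both the allocation and the effort proxy, loosely mirroring the reported link between resource allocation and perceived exertion at higher distraction levels.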