Overview
- A peer-reviewed Journal of Neuroscience study from the University of Lübeck tested 25 seven-month-olds and found stronger neural tracking for maternal speech than for strangers’ voices, independent of acoustic properties.
- When infants viewed unfamiliar faces, neural tracking of those faces increased if the accompanying voice was unfamiliar rather than the mother's.
- Facial emotion did not change this pattern; results were similar for happy and fearful expressions.
- EEG temporal-response-function (TRF) analyses showed weaker encoding of unfamiliar faces over central electrodes during maternal speech, and earlier occipital activity was associated with more accurate central face tracking (see the sketch after this list).
- The researchers report plans to examine how other maternal cues such as smell or touch influence early multisensory social processing.
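To make the TRF method concrete, here is a minimal sketch of how such an analysis is commonly set up: time-lagged copies of a stimulus feature (e.g., the speech envelope) are regressed onto an EEG channel with ridge regression, and the correlation between predicted and recorded EEG serves as the "neural tracking" score. The variable names, lag range, and ridge parameter below are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative TRF estimate via ridge regression (assumed setup, not the
# study's exact pipeline): lagged stimulus feature -> one EEG channel.
import numpy as np

def lagged_design(stim, lags):
    """Stack time-shifted copies of the stimulus into a design matrix."""
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        X[lag:, j] = stim[: n - lag]
    return X

def fit_trf(stim, eeg, lags, alpha=1.0):
    """Ridge-regression TRF weights mapping lagged stimulus to EEG."""
    X = lagged_design(stim, lags)
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ eeg), X

# Toy data: 10 s of a simulated "speech envelope" and one EEG channel at 100 Hz.
rng = np.random.default_rng(0)
fs = 100
stim = np.abs(rng.standard_normal(10 * fs))
true_kernel = np.exp(-np.arange(20) / 5.0)           # assumed response shape
eeg = np.convolve(stim, true_kernel, mode="full")[: len(stim)]
eeg += 0.5 * rng.standard_normal(len(stim))

lags = np.arange(0, 30)                               # 0-290 ms at 100 Hz
weights, X = fit_trf(stim, eeg, lags, alpha=10.0)
pred = X @ weights
tracking = np.corrcoef(pred, eeg)[0, 1]               # "neural tracking" score
print(f"prediction accuracy r = {tracking:.2f}")
```

In practice such models are fit per electrode and cross-validated, which is how channel-wise comparisons like "central" versus "occipital" encoding can be made; the single-channel toy example above only shows the core regression step.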