Overview
- Researchers led by Cédric Lenoir at UCLouvain compared auditory and tactile rhythm processing in a controlled study published in the Journal of Neuroscience.
- EEG revealed low-frequency neural oscillations aligned to the beat for sound, whereas touch elicited broader responses without a clear beat representation.
- Participants synchronized more precisely and tapped more steadily to acoustic sequences than to fingertip vibrations; a sketch of how such beat and tapping measures can be quantified follows this list.
- The team presented minute-long rhythmic patterns via headphones or a piezoelectric fingertip probe, recorded EEG across repeated presentations, and measured finger taps with a custom device.
- The authors urge follow-up work to test whether long-term musical training or sensory loss might shift how touch supports rhythm perception (DOI: 10.1523/JNEUROSCI.0664-25.2025).
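As a rough illustration of the two kinds of measures summarized above, the sketch below computes the spectral amplitude of an EEG channel at an assumed beat frequency and the variability of inter-tap intervals. The sampling rate, the 2 Hz beat frequency, the neighbouring-bin noise correction, and the function names (`beat_amplitude`, `tapping_variability`) are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Illustrative parameters (not from the paper): a 2 Hz beat embedded in a
# 60-second rhythmic sequence, EEG sampled at 500 Hz.
FS = 500          # sampling rate (Hz)
BEAT_HZ = 2.0     # assumed beat frequency
DURATION = 60.0   # sequence length (s)

def beat_amplitude(eeg: np.ndarray, fs: float = FS, beat_hz: float = BEAT_HZ) -> float:
    """Amplitude of the EEG spectrum at the beat frequency, minus the mean of
    neighbouring bins (a common noise correction in frequency-tagging analyses)."""
    n = eeg.size
    spectrum = np.abs(np.fft.rfft(eeg - eeg.mean())) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    target = int(np.argmin(np.abs(freqs - beat_hz)))
    neighbours = np.r_[target - 5:target - 1, target + 2:target + 6]
    return float(spectrum[target] - spectrum[neighbours].mean())

def tapping_variability(tap_times: np.ndarray) -> float:
    """Standard deviation of inter-tap intervals; lower values mean steadier tapping."""
    return float(np.std(np.diff(tap_times)))

# Toy usage: a noisy 2 Hz "neural" signal and slightly jittered taps.
rng = np.random.default_rng(0)
t = np.arange(0, DURATION, 1 / FS)
eeg = np.sin(2 * np.pi * BEAT_HZ * t) + rng.normal(0, 2, t.size)
taps = np.arange(0, DURATION, 1 / BEAT_HZ) + rng.normal(0, 0.01, int(DURATION * BEAT_HZ))
print(f"beat amplitude: {beat_amplitude(eeg):.3f}, tap SD: {tapping_variability(taps) * 1000:.1f} ms")
```

On synthetic data like this, a clear beat produces a spectral peak well above the neighbouring bins and a small inter-tap standard deviation; weaker beat tracking would show up as a smaller corrected amplitude and more variable taps.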