Multisensory Model Explains Eel-Like Locomotion, Backed by Robot and Animal Tests

The study shows that combined stretch and pressure feedback lets coordinated movement persist even after spinal transection.

Overview

  • The peer-reviewed paper in PNAS presents a segmented neural-circuit model that fuses stretch and pressure sensing with local central pattern generators to coordinate motion (a toy sketch of this idea follows the list).
  • Simulations and an eel-like robot using the same controller demonstrated stable swimming, terrestrial crawling and navigation around obstacles.
  • In the robot's land trials, stretch sensing proved essential for pushing against obstacles to generate forward thrust.
  • Observations of real eels and spinal-transection experiments support synchronization of body waves without input from the brain.
  • A collaboration among Tohoku University, EPFL and the University of Ottawa, supported by the Human Frontier Science Program, points to design principles for adaptable robots in confined or complex settings.
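The paper's actual controller is not reproduced here. As a rough illustration of the idea in the first bullet, the toy Python sketch below couples a chain of phase oscillators, standing in for segmental central pattern generators, through nearest-neighbor terms plus hypothetical local stretch and pressure feedback, with no global command signal. All names, gains, and values are illustrative assumptions, not the study's parameters.

```python
# Minimal sketch (not the paper's controller): a chain of phase oscillators as
# segmental CPGs, each driven only by its intrinsic rhythm, coupling to its
# immediate neighbors, and hypothetical local stretch/pressure feedback.
import numpy as np

N = 12                     # number of body segments (illustrative)
dt = 0.01                  # integration step, seconds
omega = 2 * np.pi * 1.0    # intrinsic burst frequency, ~1 Hz (assumed)
w_neighbor = 4.0           # coupling gain between adjacent segments (assumed)
w_stretch = 0.5            # gain on local stretch feedback (assumed)
w_pressure = 1.0           # gain on local pressure (contact) feedback (assumed)
lag = 0.5                  # target phase lag between neighbors, radians (assumed)

phase = np.random.uniform(0, 2 * np.pi, N)   # start fully unsynchronized

def local_feedback(bend, contact):
    """Hypothetical sensory drive: stretch ~ local bending, pressure ~ contact force."""
    return w_stretch * np.sin(bend) + w_pressure * contact

for step in range(5000):
    bend = np.sin(phase)            # crude proxy for local body curvature
    contact = np.zeros(N)           # no obstacle contact in this toy run
    dphase = omega + local_feedback(bend, contact)
    # purely local, nearest-neighbor coupling: no brain-level input anywhere
    dphase[:-1] += w_neighbor * np.sin(phase[1:] - phase[:-1] - lag)
    dphase[1:]  += w_neighbor * np.sin(phase[:-1] - phase[1:] + lag)
    phase = phase + dt * dphase

# Adjacent phase differences settle near `lag`, i.e. a head-to-tail traveling wave
# emerges from local rules alone.
print(np.round(np.diff(phase) % (2 * np.pi), 2))
```

The design point the sketch mirrors is the one the study makes: synchronization comes from local sensing and coupling in each segment, which is why cutting any long-range (descending) pathway need not break the body wave.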