Chinese Academy of Sciences Unveils ‘Brain‑Like’ AI Claiming 100× Speedups on MetaX Chips

A non‑peer‑reviewed study credits selective neuron activation with large efficiency gains on long inputs.

Overview

  • Researchers in Beijing introduced SpikingBrain 1.0, a large language model from the Chinese Academy of Sciences’ Institute of Automation.
  • The team says the model activates only the neurons needed for a given input and attends mainly to nearby context, cutting compute demands relative to the global attention of standard transformers (see the sketch after this list).
  • According to the paper, SpikingBrain achieved comparable results using under 2% of the training data required by conventional models.
  • Reported tests cite 25–100× speedups, with gains exceeding 100× on some ultra‑long sequences, including a trial on a 4‑million‑token prompt.
  • The system runs on China’s MetaX chip platform rather than Nvidia GPUs; the work is posted as a preprint on arXiv, and its claims await independent benchmarking and peer review.
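
The source describes these mechanisms only at a high level. As a rough illustration of the general ideas, the Python sketch below shows windowed (local) attention, where each token attends only to a fixed neighborhood, and a threshold gate that zeroes sub‑threshold activations in the spirit of spiking, event‑driven computation. The function names, window size, and threshold are illustrative assumptions, not SpikingBrain’s actual implementation.

```python
import numpy as np

def local_attention(q, k, v, window=4):
    """Each query attends only to keys within `window` positions,
    so cost grows roughly linearly with sequence length instead of
    quadratically, as in full global attention."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)
        weights = np.exp(scores - scores.max())  # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:hi]
    return out

def spiking_gate(x, threshold=0.5):
    """Event-driven activation (hypothetical): units below threshold
    stay silent (zero), so downstream compute can skip them."""
    return np.where(np.abs(x) >= threshold, x, 0.0)

rng = np.random.default_rng(0)
n, d = 16, 8
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
ctx = local_attention(q, k, v, window=2)
sparse = spiking_gate(ctx)
print(f"active units: {np.count_nonzero(sparse)} / {sparse.size}")
```

Full attention scores all n² token pairs, so restricting each token to a fixed window is the kind of saving that long‑prompt speedups of this sort would plausibly rely on; whether SpikingBrain works this way in detail is for the preprint and independent review to establish.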