
AI Method Delivers First Milky Way Simulation of 100 Billion Individual Stars

A deep-learning surrogate for supernova physics lets researchers bypass the tiny timesteps that resolving supernova blasts would otherwise force, yielding speedups of more than 100×, with results validated on Japan’s Fugaku and Miyabi supercomputers.

Overview

  • Led by Keiya Hirashima at RIKEN, with partners at The University of Tokyo and Universitat de Barcelona, the team simulated the evolution of the Milky Way’s stars over 10,000 years.
  • The hybrid design couples a trained deep-learning surrogate for post-supernova gas evolution with conventional numerical solvers to preserve fine-scale physics (the coupling pattern is sketched in the code after this list).
  • Runtime dropped to 2.78 hours per 1 million years of galaxy evolution, implying roughly 115 days for a 1‑billion‑year run versus about 36 years with prior methods (see the arithmetic check below).
  • The team reported individual-star resolution about 100 times higher and performance more than 100 times faster than earlier state-of-the-art approaches.
  • The work, presented at SC ’25, ran across roughly 7 million CPU cores and was validated through large-scale runs on the Fugaku and Miyabi systems; the team also points to potential applications in climate, weather, and ocean modeling.
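
To illustrate the coupling pattern described in the second bullet, here is a minimal Python sketch. It is not the team’s code: the function names, the state layout, and the simple event loop are all hypothetical stand-ins. The point is the control flow, which the reporting does describe: when a supernova fires, a trained surrogate predicts the post-blast gas state in one shot, so the solver never has to shrink its timestep to resolve the blast wave.

```python
import numpy as np

def surrogate_sn_feedback(gas: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for the trained deep-learning surrogate.

    In the reported work, a neural network trained on high-resolution
    supernova simulations maps the gas around a supernova to its state
    well after the blast. Here we fake that with a fixed perturbation.
    """
    out = gas.copy()
    out[:, 3] *= 10.0  # pretend column 3 is thermal energy; inject blast energy
    return out

def solver_step(gas: np.ndarray, dt: float) -> np.ndarray:
    """Hypothetical stand-in for one coarse step of the conventional solver."""
    return gas  # a real solver would update densities, velocities, energies

def evolve(gas: np.ndarray, t_end: float, dt: float,
           sn_times: list[float]) -> np.ndarray:
    """Hybrid loop: coarse solver steps, surrogate jumps over supernovae.

    Without the surrogate, each supernova would force dt down by orders of
    magnitude to resolve the blast; with it, the coarse dt is kept throughout.
    """
    pending = sorted(sn_times)
    t = 0.0
    while t < t_end:
        while pending and pending[0] <= t:
            gas = surrogate_sn_feedback(gas)  # one-shot jump over the blast
            pending.pop(0)
        gas = solver_step(gas, dt)
        t += dt
    return gas

# Toy usage: 1,000 gas elements with 4 fields, two supernovae along the way.
gas0 = np.ones((1000, 4))
final = evolve(gas0, t_end=1.0, dt=0.01, sn_times=[0.25, 0.6])
```

The design choice this reflects is that the surrogate handles only the stiff, short-timescale piece (the supernova aftermath), while everything else stays on the conventional solver, which is what preserves the fine-scale physics elsewhere.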
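
And a quick check of the runtime arithmetic in the third bullet (plain unit conversion, using only the figures quoted above):

```python
hours_per_myr = 2.78                       # reported runtime per 1 Myr of evolution
days_per_gyr = hours_per_myr * 1_000 / 24  # 1 Gyr = 1,000 Myr; 24 h per day
print(f"{days_per_gyr:.1f} days")          # 115.8 -> "roughly 115 days" checks out
```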