New CRAM Technology Could Cut AI Energy Use by a Factor of 1,000
University of Minnesota engineers introduce a breakthrough in-memory computing design, promising unprecedented efficiency for AI applications.
- CRAM (Computational Random-Access Memory) processes data entirely within the memory array, eliminating the energy-intensive transfers between memory and processor that dominate conventional designs (see the sketch after this list).
- The design is built on spintronic devices called magnetic tunnel junctions (MTJs), which store data using electron spin rather than electrical charge, improving speed, energy efficiency, and resilience.
- Initial tests show CRAM can be up to 2,500 times more energy-efficient and 1,700 times faster than current systems.
- Global AI energy consumption is projected to double by 2026, making CRAM's efficiency gains crucial.
- Researchers plan to collaborate with industry leaders to scale up and integrate CRAM into mainstream applications.
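The energy argument behind in-memory computing is that moving data costs far more than operating on it. The toy Python model below illustrates that arithmetic only; the per-operation energy constants and function names are illustrative assumptions, not measured CRAM or DRAM figures from the research.

```python
# Toy model: why eliminating memory-to-processor transfers saves energy.
# All energy constants below are hypothetical placeholders for illustration,
# NOT measured values from the University of Minnesota work.

TRANSFER_ENERGY_PJ = 100.0   # assumed cost to move one operand between memory and processor
COMPUTE_ENERGY_PJ = 1.0      # assumed cost of one arithmetic operation

def von_neumann_energy(num_ops: int) -> float:
    """Conventional design: fetch two operands, compute, write the result back."""
    per_op = 3 * TRANSFER_ENERGY_PJ + COMPUTE_ENERGY_PJ
    return num_ops * per_op

def in_memory_energy(num_ops: int) -> float:
    """In-memory design: operands never leave the memory array."""
    return num_ops * COMPUTE_ENERGY_PJ

if __name__ == "__main__":
    ops = 1_000_000  # e.g. multiply-accumulates in a small neural-network layer
    conventional = von_neumann_energy(ops)
    in_memory = in_memory_energy(ops)
    print(f"conventional: {conventional / 1e6:.1f} uJ")
    print(f"in-memory:    {in_memory / 1e6:.1f} uJ")
    print(f"ratio:        {conventional / in_memory:.0f}x")
```

With these made-up numbers the in-memory path comes out a few hundred times cheaper; the point is only that the savings scale with how much data movement is removed, which is the mechanism CRAM exploits.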