Overview
- Nvidia has moved its AI servers from DDR5 to LPDDR to cut power use, repurposing a memory type commonly used in phones and tablets.
- Counterpoint forecasts that server-memory prices will double by late 2026 as LPDDR demand from servers accelerates.
- Each AI server requires far more LPDDR than a handset does, creating a surge in orders that research firms say the industry is not prepared to handle.
- Major suppliers including Samsung Electronics, SK Hynix and Micron have already tightened output of older DRAM to prioritize high-bandwidth memory for AI accelerators.
- Analysts warn that chipmakers may divert more capacity to LPDDR to meet Nvidia's needs, spreading supply tightness across the broader memory market and raising data-center costs for cloud providers and AI developers.