Particle.news

SK hynix Showcases 16-Layer HBM4 and AI Memory Portfolio at CES 2026

The company is using CES to push a shift toward inference‑efficient, system‑level memory designs tied to customer roadmaps.

Overview

  • The company is exhibiting a 48GB, 16-layer HBM4 at its customer booth, describing the part as under development in line with customer schedules.
  • A 36GB, 12-layer HBM3E, presented as the product expected to lead the market this year, is shown alongside the AI-server GPU modules that use it.
  • An AI System Demo Zone features a large cHBM mock-up and demonstrations of PIM/AiMX, Compute-using-DRAM, CMM-Ax, and data-aware CSD to illustrate integrated compute-in-memory approaches.
  • The lineup includes SOCAMM2 for low-power AI servers, LPDDR6 for on-device AI, and a 321-layer 2Tb QLC NAND aimed at ultra-high-capacity enterprise SSDs.
  • SK hynix is prioritizing one-to-one engagement through a customer-only booth; Yonhap, citing industry sources, reported a meeting between the CEO and Nvidia officials in Las Vegas.