Overview
- Qualcomm introduced the AI200, due in 2026, and the AI250, due in 2027, offered as accelerator cards and full liquid‑cooled rack systems, and committed to an annual release cadence.
- The products target inference (running pre‑trained models) rather than training, with rack configurations drawing about 160 kilowatts, comparable to rival systems.
- Qualcomm says its cards support up to 768 gigabytes of memory each and are designed to lower total cost of ownership for inference workloads.
- Saudi‑backed AI startup Humain is the first customer, planning deployments totaling up to 200 megawatts starting in 2026.
- Qualcomm’s stock jumped between roughly 15% and 22% on the announcement, positioning the company as a new contender in a market still dominated by Nvidia GPUs.