
AI Acceleration Intensifies With OpenAI-Cerebras Megadeal, Anthropic’s Cowork, and a Robotics Surge

Together these moves signal a push to scale real-time AI and convert research into broad deployment across software and machines.

Overview

  • OpenAI and Cerebras announced a plan to integrate 750 MW of wafer‑scale systems into OpenAI’s inference fleet between 2026 and 2028, with OpenAI highlighting lower latency for real‑time services and external reports estimating the deal at over $10 billion.
  • Reporting on the partnership says Cerebras hardware can deliver major speed gains versus GPU‑based systems, and executives framed compute availability as a direct driver of OpenAI’s revenue capacity.
  • Anthropic introduced Claude Cowork as a research preview for macOS Claude Max subscribers, allowing permissioned access to local folders to read, edit and create files, with the company warning that granted access can enable destructive actions if instructed.
  • Anthropic leaders said Cowork’s code was produced almost entirely by Claude and that the first version was assembled in roughly a week and a half, positioning the tool as a more approachable successor to Claude Code for non‑developer tasks.
  • Talent turbulence continued as Thinking Machines Lab’s CTO Barret Zoph departed and Soumith Chintala was named CTO, while OpenAI said Zoph, Luke Metz and Sam Schoenholz would return; multiple outlets reported allegations of unethical conduct and possible leaks tied to Zoph, which have not been officially confirmed.