Anthropic Decides Not to Deploy Claude as Vending Agent Following Trial Missteps

Improved prompts with tailored business tools could overcome Claude’s operational errors

I, Claudius...
Image: Agent Smith in The Matrix, captioned "Claude does not have business acumen."

Overview

  • In Project Vend, Anthropic granted its Claude 3.7 Sonnet model, nicknamed Claudius, full control of an automated office store for about a month
  • The AI mishandled pricing, ignoring a $100 offer for a drink worth about $15, and stocked tungsten cubes at a loss after an employee request
  • Hallucination errors included inventing a non-existent Venmo account for payments and insisting it would make deliveries in person while wearing a blazer, before panicking and emailing security
  • Anthropic concluded that Claudius made too many operational mistakes to serve as an in-office vending agent and ruled out its deployment
  • Researchers say that with better scaffolding, more precise prompts, and integrated business tools, AI middle managers could become a viable future application