Overview
- Heber City police were testing Axon’s Draft One and a rival tool from the startup Code Four in December when a mock traffic-stop report claimed an officer had transformed into a frog.
- Police identified Draft One, which uses OpenAI’s GPT models to generate report narratives from body-camera audio, as the system that pulled movie dialogue into the draft.
- Reviewers flagged numerous inaccuracies that required substantial corrections, even as Sgt. Rick Keel reported saving six to eight hours of paperwork each week.
- The department continues its pilot with added oversight and has not decided whether to keep using Draft One.
- The incident has intensified concerns from experts and advocates about automation errors, weak audit trails, and accountability in AI-generated police records.