AI Transcription Tool Whisper Faces Criticism for Fabricating Text
Whisper's inaccuracies raise concerns, especially in medical settings, prompting calls for stricter AI regulations.
- Whisper, developed by OpenAI, is reported to produce "hallucinations": fabricated text that does not appear in the original audio.
- Researchers found hallucinations in 40% of the audio snippets they examined, with some fabrications including invented racial commentary and violent rhetoric.
- The tool is widely used across industries, including healthcare, despite OpenAI's warnings against deploying it in high-risk domains.
- Over 30,000 clinicians and 40 health systems have adopted Whisper-based tools, raising concerns about potential misdiagnoses.
- Experts are calling for federal regulation of AI tools and urging OpenAI to address Whisper's flaws.