Consumer Reports Flags Security Gaps in AI Voice Cloning Tools

A new investigation finds that most voice cloning tools lack safeguards to prevent misuse, raising concerns about fraud and disinformation.

  • Consumer Reports evaluated six leading AI voice cloning tools and found that four lacked effective safeguards to prevent unauthorized voice cloning.
  • Only Descript and Resemble AI implemented additional security measures, such as real-time audio verification or recorded consent statements, though neither approach proved foolproof.
  • Voice cloning technology has been exploited in scams, including impersonating family members to solicit money and creating disinformation during elections.
  • Consumer Reports recommended stronger protections, such as requiring unique scripts for voice cloning, watermarking AI-generated audio, and verifying user identities through credit card information.
  • The report highlights the lack of federal regulations governing generative AI technologies, leaving companies to self-regulate the ethical use of their tools.