Overview
- Creators must submit a government ID and a biometric video to enable likeness detection, which scans YouTube for AI-altered or AI-generated depictions of their face and lets them request removals.
- YouTube says biometric data gathered for the feature has never been used to train Google's generative AI models and is used only for verification and detection; it plans to clarify the in-product wording without changing the underlying policy.
- The company plans to extend access to more than 3 million YouTube Partner Program creators by the end of January, according to head of creator product Amjad Hanif.
- Rights-management firms Vermillio and Loti advise clients not to enroll, citing the privacy policy’s broad training language and warning that high-quality biometric samples linked to a name could enable more convincing synthetic content.
- Creators currently cannot monetize unauthorized uses of their likeness on YouTube. While the platform says it is exploring a model similar to Content ID, it previously allowed third-party AI firms to train on creator videos, with millions of creators opting in and receiving no compensation.