Overview
- The IT Rules draft published on October 22 defines “synthetically generated information” and mandates prominent, permanent labels covering at least 10% of the visual display area or the first 10% of an audio clip's duration.
- Platforms and AI tools must ask uploaders to declare whether content is synthetically generated, deploy technical checks to verify those declarations, and embed identifiers in ways users cannot remove.
- Intermediaries that fail to detect, label or act on synthetic media risk losing safe‑harbour protection, while the authority to issue content takedown orders is limited to senior officials, such as those of joint secretary rank or police officers of DIG rank.
- Ahead of the Bihar assembly polls, the Election Commission’s October 24 advisory requires parties to label and attribute AI campaign material, keep internal records, and remove misleading deepfakes within three hours of notice.
- MeitY has opened public consultation on the draft until November 6, as experts press for practical carve‑outs for AR/VR, animation and enterprise uses and call for AI literacy to make enforcement workable.