Overview
- The Standing Committee asked the government to build legal and technological tools to identify and prosecute purveyors of AI-generated misinformation.
- It recommended exploring licensing for AI content creators and mandatory labels on AI-generated images, videos and articles.
- The panel called on the Information and Broadcasting Ministry, MeitY and other departments to coordinate on implementing a common framework.
- The report records that MeitY has set up a nine-member group on deepfakes and is funding projects to detect fake speech and deepfakes.
- Ministries cautioned that AI cannot yet perform full fact-checking and should instead be used to flag suspect content for human review.
- The committee also urged media outlets to maintain fact-check units and internal ombudsmen.