Particle.news

Google to Use StopNCII Hashes to Remove Non‑Consensual Intimate Imagery From Search

A phased rollout will rely on victim-submitted hashes, leaving gaps for AI-generated deepfakes and for sites that do not partner with StopNCII.

Overview

  • Google announced the partnership at an NCII summit in its London office on Sept. 17 and said it is testing the system with plans to begin using the hashes in the coming months.
  • The company is not yet listed as a StopNCII partner while testing proceeds, with Google citing the need to evolve processes and infrastructure for deployment.
  • StopNCII lets adults create on‑device hashes of intimate images or videos, which participating platforms can match to detect and remove corresponding content without uploading the originals.
  • The system uses PDQ for images and MD5 for videos, and platforms that implement real‑time matching can block reuploads before they appear.
  • Advocates call the move significant yet overdue, noting Google lagged peers like Meta, TikTok, Bumble and Microsoft’s Bing, and warning the approach still misses AI‑generated deepfakes and places much of the burden on survivors.
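The hash-matching approach described above can be sketched in a few lines. This is an illustrative Python example, not StopNCII's actual implementation: it shows only the exact-match MD5 path used for videos (PDQ, the perceptual hash used for images, requires a dedicated library and tolerates near-duplicates rather than exact matches). The function names and the 8 KB chunk size are assumptions for the sketch.

```python
import hashlib


def file_md5(path: str) -> str:
    """Compute the MD5 digest of a file locally, streaming in chunks.

    Only this hex digest would be shared with a matching service;
    the file itself never leaves the device.
    """
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in 8 KB chunks (illustrative size) so large videos
        # are hashed without loading them fully into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def matches_known_hashes(path: str, known_hashes: set[str]) -> bool:
    """Check an uploaded file's digest against victim-submitted hashes.

    A platform doing real-time matching would run a check like this
    at upload time and block the file before it appears.
    """
    return file_md5(path) in known_hashes
```

Note that exact hashes like MD5 break on any re-encoding or cropping of a video, which is why perceptual hashes such as PDQ are used for images; either way, matching only works for content whose hash a survivor has already submitted.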