Overview
- Sens. John Curtis and Mark Kelly unveiled a bill to amend Section 230 so platforms can be sued when recommendation engines foreseeably contribute to bodily injury or death.
- The measure imposes a duty of care: platforms must exercise reasonable care in the design, training, testing, deployment, operation, and maintenance of their recommendation algorithms.
- It applies to for-profit social platforms with more than one million users; chronological feeds and direct search results are carved out, and protections against viewpoint-based enforcement are preserved.
- The legislation creates a private right of action for victims and allows states to enact similar or stronger protections aligned with the federal standard.
- Supporters cite evidence linking algorithmic promotion to radicalization and self-harm, while civil-liberties groups such as the EFF warn the measure could drive over-removal of content.
- The bill has been introduced but has not been scheduled for a vote.