Meta Ends U.S. Fact-Checking Program and Loosens Content Rules
Mark Zuckerberg announces sweeping changes to Meta's content moderation policies, signaling a shift toward less restrictive speech guidelines and closer alignment with the incoming Trump administration.
- Meta has discontinued its third-party fact-checking program in the U.S., citing concerns over political bias and a desire to prioritize free expression.
- The company plans to replace fact-checking with a user-driven "Community Notes" system, similar to the model used by Elon Musk's X platform.
- Content moderation policies have been scaled back, permitting some previously prohibited speech on politically and socially sensitive topics across Meta's platforms.
- Meta has also ended its diversity, equity, and inclusion initiatives, citing legal and cultural shifts, and removed trans-inclusive features and resources from its offices and apps.
- These changes coincide with significant donations from Meta and other tech leaders to Trump's inaugural fund, raising questions about the tech industry's influence and alignment with the incoming administration.