Meta Ends Fact-Checking Program in Shift to User-Based Moderation
Mark Zuckerberg announces a major overhaul of Meta's content moderation policies, replacing professional fact-checking with a crowdsourced system inspired by X's Community Notes.
- Meta will phase out its third-party fact-checking program in the U.S., winding down a partnership that, since 2016, has grown to include roughly 90 organizations working across 130 countries.
- The company will adopt a user-driven 'Community Notes' system to flag and annotate misinformation, modeled after a feature on Elon Musk's X platform.
- The policy changes also loosen content filters and scale back automated moderation, reserving it for the most severe violations, such as terrorism and child endangerment.
- Critics argue the move is timed to the incoming Trump administration and reflects a broader effort by Zuckerberg to court right-wing political interests.
- The decision has raised concerns that harassment and misinformation will increase, and that the platform will be less able to address bullying and hate speech effectively.