Meta Ends U.S. Fact-Checking Program, Shifting to Crowdsourced Moderation
The decision to replace third-party fact-checking with a community-driven model raises global concerns about misinformation and hate speech proliferation.
- Meta has announced the termination of its U.S. fact-checking partnerships with third-party organizations, opting instead for a community-based moderation system modeled on the Community Notes feature of Elon Musk's X platform.
- Mark Zuckerberg framed the decision as a move to prioritize free expression, citing frustrations with perceived censorship and over-enforcement under the previous model.
- Critics warn the shift could fuel a rise in misinformation, hate speech, and disinformation worldwide, especially in countries that rely on Meta's platforms for news and communication.
- Global fact-checking organizations, many of which depend on Meta's funding, fear the U.S. move may signal broader cuts, jeopardizing efforts to combat harmful content in vulnerable regions.
- The change coincides with broader political and business dynamics, including Meta's alignment with the incoming U.S. administration and ongoing debates over content moderation and free speech.