Meta Buries Third-Party Fact-Checking Program, Replaces It With Community Notes Akin To X
Meta’s Chief Global Affairs Officer said, “We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context.”
Meta is burying its third-party fact-checking program and borrowing a feature from Elon Musk-owned X to replace it. Chances are you have already guessed which feature we are talking about. Facebook and Instagram owner Meta is scrapping its third-party fact-checking system and replacing it with a Community Notes program, similar to the Community Notes feature on X.
Announcing the change, Meta said it believes expert fact-checkers have had their own biases, and that too much content was being fact-checked because of those biases, the Associated Press reported.
Meta’s Chief Global Affairs Officer Joel Kaplan, in a blog post, said, “We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context.”
Meta To Ease Restrictions Too?
The social media giant announced its intention to adjust its content moderation policies, emphasising a shift towards promoting "more speech" by relaxing restrictions on certain topics that are widely discussed in mainstream conversations. This strategic change aims to prioritize the enforcement of rules against illegal activities and "high severity violations," including terrorism, child exploitation, and drug-related content, rather than overregulating less critical issues.
Meta admitted that its existing approach, which relies on intricate systems to manage content across its platforms, has become overly complex. The company acknowledged that this has led to frequent mistakes, resulting in excessive censorship and the removal of content that did not necessarily warrant such action.
CEO Mark Zuckerberg attributed the policy shift partly to Donald Trump's presidential election victory. He noted that the political landscape highlighted the need for a more balanced approach to content moderation, one that supports free expression while addressing truly harmful content. In a video, Zuckerberg said, "The recent elections also feel like a cultural tipping point towards once again prioritizing speech."