Meta Abandons Third-Party Fact-Checking, Shifts to Community-Led Moderation

Elliot Kim

January 08, 2025 · 3 min read

Meta, the parent company of Facebook and Instagram, has announced a significant overhaul of its content moderation policies, abandoning its third-party fact-checking program in favor of a community-led approach. The move, which comes just two weeks before President-elect Donald Trump's inauguration, has sparked concerns about the potential spread of misinformation on the platform.

Under the new policy, Meta will rely on "Community Notes" – a crowdsourced approach to content moderation similar to that used by X – to identify and review misinformation. The company claims that this approach will be less prone to bias and more effective in providing users with accurate information. However, experts warn that the shift away from professional fact-checking could allow disinformation and hate to spread more easily online and into the real world.

The changes are part of a broader effort by Meta to prioritize "More Speech and Fewer Mistakes" on its platforms. According to Joel Kaplan, Meta's new policy lead, the company will focus more on preventing over-enforcement of its content policies and less on policing potentially harmful but legal speech. The updated Hateful Conduct policy, for example, now allows users to call gay and trans people "mentally ill," while an explicit ban on referring to women as "household objects" has been removed.

Meta's decision to abandon third-party fact-checking has been met with criticism from experts and advocacy groups. The company's fact-checking partners were informed of the change just 45 minutes before the public announcement; they will continue to receive payments until August, and those who haven't signed 2025 contracts could receive severance. The move has, however, been praised by X CEO Linda Yaccarino, who commended Mark Zuckerberg's decision to adopt a Community Notes-style moderation approach.

President-elect Trump has also weighed in on the issue, claiming that his threats "probably" influenced Meta's decision to change its policies. Trump's FCC chairman pick, Brendan Carr, has been a vocal critic of tech companies' fact-checking programs, and Meta's move may be seen as an attempt to appease the incoming administration.

The implications of Meta's policy shift are far-reaching and could have significant consequences for the spread of misinformation online. As the company moves its trust and safety teams from California to Texas, citing concerns about bias, it remains to be seen how effective the new approach will be in promoting free expression while preventing the spread of harmful content.

In a statement, the Meta Oversight Board, a semi-independent body that interprets Meta's rules and suggests changes, emphasized its continued role in reviewing the company's decisions, hinting at concerns about the policy shift's potential impact on the board's future.

As the tech industry continues to grapple with the challenges of content moderation, Meta's decision to abandon third-party fact-checking serves as a significant development in the ongoing debate about the role of technology companies in regulating online speech.

Copyright © 2024 Starfolk. All rights reserved.