Meta Overhauls Content Moderation Policies Amid Political Shifts
Meta Platforms, the parent company of Facebook, Instagram, and Threads, announced a significant shift in its content moderation policies on Tuesday. The company is scrapping its US fact-checking program and loosening restrictions on controversial topics such as immigration and gender identity. The change comes as President-elect Donald Trump prepares for his second term and as Meta faces pressure from conservative voices.
Key Changes in Moderation Strategy
Meta CEO Mark Zuckerberg described the overhaul as a return to the platform’s roots in free expression. The company will replace its fact-checking program with a “Community Notes” system, similar to the one used on Elon Musk’s platform, X. Proactive scanning for hate speech and other rule-breaking content will also end. Instead, Meta will review such posts only when users flag them, while automated systems focus on severe violations such as terrorism, child exploitation, scams, and drug-related content.
“We’ve made too many mistakes and applied too much censorship,” Zuckerberg said, citing the recent US elections as a cultural turning point toward prioritising speech.
Leadership and Organisational Changes
In line with the policy shifts, Meta has elevated Republican policy executive Joel Kaplan to the role of global affairs head. Dana White, CEO of the Ultimate Fighting Championship and a Trump ally, was recently added to Meta’s board. Additionally, teams handling content policy will relocate from California to states like Texas, though plans for the move have yet to be shared with employees.
Reactions to the Announcement
The decision to end the fact-checking program, established in 2016, surprised partner organisations.
Agence France-Presse (AFP), a partner in the program, called the decision “a hard hit for fact-checking and journalism.” Angie Drobnic Holan, head of the International Fact-Checking Network, disputed Zuckerberg’s claim that fact-checkers were biased or censorious, noting that fact-checking adds context rather than removing content.
Critics argue the move prioritises political appeasement over informed content moderation. Ross Burley, co-founder of the Centre for Information Resilience, warned, “This is a major step back at a time when disinformation is evolving faster than ever.”
Broader Implications
Meta’s changes will initially apply only in the US, where political pressures heavily influence content policies. However, its fact-checking program will continue in regions like the European Union, which enforces stricter regulations under its Digital Services Act (DSA). The DSA mandates that platforms combat illegal content and disinformation or face fines up to 6% of global revenue.
The EU is already investigating Musk’s X platform over its “Community Notes” system. A European Commission spokesperson indicated ongoing monitoring of Meta’s compliance with EU regulations.
What’s Next for Meta
Meta plans to phase in Community Notes across its US platforms in the coming months, with improvements expected over the next year. Despite controversy, Zuckerberg framed the changes as an essential shift toward prioritising free expression, marking a new chapter for Meta’s role in moderating public discourse.
With inputs from Reuters