Meta CEO Mark Zuckerberg unveiled significant changes to the company’s moderation policies on Tuesday, citing a shifting political and social climate and the need to restore free expression across its platforms. The updates will impact Facebook, Instagram, and Threads, which together serve billions of users globally.
Meta will discontinue its existing fact-checking program, which relied on partnerships with third-party organizations, and implement a community-driven system similar to X’s Community Notes, according to a report by NBC News.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in a video message. “First, we’re going to get rid of fact-checkers and replace them with community notes similar to X, starting in the U.S.”
In addition, Meta will modify its content moderation policies, particularly around political topics. Changes that previously reduced political content in user feeds will be undone. Zuckerberg highlighted the U.S. election as a pivotal factor influencing these decisions, criticizing what he described as pressure from “governments and legacy media” to increase censorship.
“The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech,” he said.
Zuckerberg acknowledged that the complex systems Meta had developed to moderate content were prone to errors, impacting millions of users.
“We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes,” he said. “Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship.”
While Meta will continue to strictly moderate content related to drugs, terrorism, and child exploitation, the company plans to ease some policies surrounding sensitive topics like immigration and gender. Automated moderation systems will now focus on “high-severity violations,” with the company relying more on user reports to surface less severe issues.
“We’re also going to tune our content filters to require much higher confidence before taking down content,” Zuckerberg explained. “The reality is that this is a trade-off. It means we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
The decision to end the fact-checking program marks a departure from Meta’s earlier efforts, launched in 2016, which involved third-party fact-checkers certified by organizations like the International Fact-Checking Network (IFCN). Over 90 organizations participated, fact-checking content in more than 60 languages.
Meta’s shift mirrors broader trends in the social media industry, where companies have increasingly scaled back moderation efforts amid criticism of bias and politicization. Conservatives, in particular, have long accused Meta’s fact-checking system of favoring liberal viewpoints, a claim that has been disputed.
X’s Community Notes, the model for Meta’s new system, has gained popularity among conservative users for its crowdsourced, community-driven approach to fact-checking.
Zuckerberg’s announcement comes as social media companies navigate a politically charged environment. The NBC News report notes that Meta, like other tech giants, has sought to align with incoming political leadership. The company donated $1 million to President-elect Donald Trump’s inaugural fund, and Zuckerberg praised Trump in an interview before the election, though he stopped short of an endorsement.
Bd-pratidin English/Lutful Hoque