Meta announced Tuesday that it is abandoning its third-party fact-checking programs on Facebook, Instagram, and Threads and replacing its army of paid moderators with a Community Notes model that mimics X's much-maligned volunteer program, which allows users to publicly flag content they consider inaccurate or misleading.
In a blog post announcing the news, Meta's newly appointed chief global affairs officer, Joel Kaplan, said the decision was made to allow more topics to be discussed openly on the company's platforms. The change will first affect the company's moderation in the US.
Kaplan said, “We will allow more speech by lifting restrictions on certain topics that are part of mainstream discourse and focus our enforcement on illegal and high-severity violations.” He did not specify which topics these new rules would cover.
In a video accompanying the blog post, Meta CEO Mark Zuckerberg said the new policies would see more political content in people's feeds, as well as posts on other issues that have stoked the culture wars in the US in recent years.
Zuckerberg said, “We're going to simplify our content policies and get rid of restrictions on topics like immigration and gender that are out of touch with mainstream discourse.”
Meta's rollback of fact-checking and abandonment of content moderation policies marks a significant reversal of measures it put in place after revelations in 2016 of influence operations conducted on its platforms, which were designed to sway elections and, in some cases, incite violence and even genocide.
Ahead of last year's high-profile elections around the world, Meta was criticized for taking a hands-off approach to content moderation related to those votes.
Echoing comments Mark Zuckerberg made last year, Kaplan said that Meta's content moderation policies were implemented not to protect users but “partly in response to social and political pressure to control content.”
Kaplan also blamed the “biases and viewpoints” of fact-checking experts for over-moderation: “Over time we fact-checked too much content that people would consider legitimate political speech and debate,” Kaplan wrote.
However, WIRED reported last year that dangerous content like medical misinformation has flourished on the platform, while groups like anti-government militias have used Facebook to recruit new members.
Zuckerberg, meanwhile, blamed “legacy media” for forcing Facebook to implement content moderation policies in the wake of the 2016 election. “After Trump was first elected in 2016, legacy media consistently wrote about how misinformation is a threat to democracy,” Zuckerberg said. “We tried, in good faith, to address those concerns without becoming the arbiters of truth, but the fact-checkers have been too politically biased and have destroyed more trust than they built.”
In what he framed as an effort to address that bias, Zuckerberg said Meta's in-house trust and safety team would move from California to Texas, which is also now home to X's headquarters. “As we work to promote free expression, I think this will help us build trust to do this work in places where there is less concern about bias on our teams,” Zuckerberg said.