Experts warn that Meta's decision to end its third-party fact-checking program could allow disinformation and hate to spread online and spill over into the real world.
The company announced today that it is phasing out the program, launched in 2016, in which it partnered with independent fact-checkers around the world to identify and review misinformation across its social media platforms. Meta is replacing the program with a crowdsourced approach to content moderation similar to X's Community Notes.
Meta is essentially putting the onus on users to weed out lies on Facebook, Instagram, Threads, and WhatsApp, raising fears that misinformation about climate change, clean energy, public health risks, and communities often targeted with violence will become easier to spread.
“This is going to hurt Meta's users first because the program has done a good job at reducing the virality of fake content and conspiracy theories,” says Angie Drobnik Holan, director of the International Fact-Checking Network (IFCN) at Poynter.
“A lot of people think that Community Notes-style moderation doesn't work at all and is just window dressing so platforms can say they're doing something… Most people do not want to have to wade through a bunch of misinformation on social media, fact-checking everything themselves,” Holan says. “The losers here are people who want to be able to go on social media and not be overwhelmed with false information.”
In a video, Meta CEO Mark Zuckerberg framed the decision as a matter of promoting free speech, while calling fact-checkers “too politically biased.” Meta also said the program was too heavy-handed, claiming that one to two out of every 10 pieces of content it removed in December were mistakes and may not have actually violated company policies.
Holan says the video was “incredibly unfair” to fact-checkers, who have worked as partners with Meta for nearly a decade. Meta worked exclusively with IFCN-certified fact-checkers, who had to adhere to the network's code of principles as well as Meta's own policies. Fact-checkers reviewed content and rated its accuracy. But when it came to removing content or limiting its reach, it was Meta, not the fact-checkers, that made the call.
Poynter owns PolitiFact, one of Meta's fact-checking partners in the US. Before stepping into her role at IFCN, Holan was editor-in-chief of PolitiFact. What makes the fact-checking program effective, Holan says, is that it acts as a “speed bump in the way of false information.” A screen is typically placed over flagged content, letting users know that fact-checkers found the claim questionable and asking whether they still want to view it.
Holan says the process covered a wide range of topics, from false information about celebrity deaths to claims about miracle cures. Meta launched the program in 2016 amid growing public concern about the spread of unverified rumors on social media, such as false stories that year claiming the Pope had endorsed Donald Trump for president.
Meta's decision looks more like an effort to curry favor with President-elect Trump. In his video, Zuckerberg described the recent election as “a cultural tipping point” toward free speech. The company recently named Republican lobbyist Joel Kaplan as its new chief global affairs officer and added UFC CEO and president Dana White, a close friend of Trump, to its board. Trump also said today that the changes at Meta were “probably” a response to his threats.
“Zuck's announcement is a full bending of the knee to Trump and an attempt to catch up to [Elon] Musk in his race to the bottom. The implications are going to be far-reaching,” Nina Jankowicz, CEO of the nonprofit American Sunlight Project and an adjunct researcher at Syracuse University who studies disinformation, said in a post on Bluesky.
Twitter launched its community moderation program, then called Birdwatch, in 2021, before Musk took over the company. Musk, who helped finance Trump's campaign and is now set to lead the incoming administration's “Department of Government Efficiency,” expanded the program's role after Twitter gutted the teams responsible for content moderation. Hate speech, including slurs against Black and transgender people, rose on the platform after Musk bought the company, according to research by the Center for Countering Digital Hate. (Musk then sued the center, but a federal judge dismissed the case last year.)
Advocates are now concerned that harmful content could spread unhindered on Meta's platforms. “Meta is now saying it's up to you to spot the lies on its platforms, and that it's not their problem if you can't tell the difference, even if those lies, hate, or scams end up hurting you,” Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, said in an email. Ahmed calls it “a huge step back for online safety, transparency, and accountability,” adding that “it could have dire offline consequences in the form of real-world harm.”
“By abandoning fact-checking, Meta is opening the door to unchecked hateful disinformation about already targeted communities like Black, brown, immigrant, and trans people, which too often leads to offline violence,” Nicole Sugarman, campaign manager at Kairos, a nonprofit that works to counter race- and gender-based hate online, said in an emailed statement to The Verge today.
Meta's announcement today specifically says it is getting rid of “a number of restrictions on topics like immigration, gender identity, and gender that are the subject of frequent political discourse and debate.”
Scientists and environmental groups are also wary of the changes at Meta. “Mark Zuckerberg's decision to abandon efforts to fact-check and correct misinformation and disinformation means that anti-science content will continue to proliferate on Meta's platforms,” Kate Cell, senior climate campaign manager at the Union of Concerned Scientists, said in an emailed statement.
“I think this is a terrible decision… disinformation's effects on our policies have become more and more obvious,” says Michael Khoo, director of the climate disinformation program at Friends of the Earth. He points to attacks on wind power that have affected renewable energy projects as an example.
Khoo also compares the Community Notes approach to the fossil fuel industry's marketing of recycling as a solution to plastic waste. In reality, recycling has done little to stem the tide of plastic pollution flooding the environment, since the material is difficult to reprocess and many plastic products are not really recyclable. The strategy also puts the onus on consumers to deal with a company's waste. “[Tech] companies need to own the problem of disinformation that their own algorithms are creating,” Khoo tells The Verge.