It came as no surprise to Natália Leal, executive director of Lupa, one of Brazil's largest fact-checking agencies, when she received the news that Meta would discontinue its fact-checking program in the United States, replacing it with a system similar to the one used on the platform X. Like Elon Musk and other tech magnates, Meta CEO Mark Zuckerberg had already hinted that he would align himself more closely with U.S. President Donald Trump.
“What we didn’t expect was for the change to happen now and so abruptly,” Leal told LatAm Journalism Review (LJR). “Meta’s networks will not only stop having fact-checking but will also cease moderating homophobic, racist, or misogynistic content.”
Zuckerberg’s decision, announced on Jan. 7, is currently limited to the United States. Last Monday (Jan. 20), Meta’s head of global business, Nicola Mendelsohn, said at the World Economic Forum in Davos that the company would continue to use fact-checkers in other countries “for now,” and that such a decision would depend on the consequences of the changes in the United States.
Professional fact-checkers in Latin America expressed skepticism, saying they consider it likely that Meta will eventually discontinue its fact-checking partnerships in the rest of the world.
If that happens, the end of Meta's support would mean the loss of a significant source of revenue for fact-checking agencies, which operate in one of the fastest-growing areas of journalism over the past decade. It would also mean that false and misleading information, from political propaganda to financial scams, would circulate more freely on social networks such as Instagram and Facebook.
For Leal, these changes are symptomatic of profound political transformations.
“There’s a certain attack on democracy as an established model, which is based on dialogue and consensus-building,” she said. “There is an ongoing paradigm shift, with an overvaluation of freedom of expression to the detriment of other rights, such as the right to be who you are and the freedom to express your identity without suffering attacks.”
Meta, formerly Facebook, launched its fact-checking program in the first half of 2018, after revelations that the British consulting firm Cambridge Analytica had harvested millions of Facebook users’ personal data to target political advertising during the 2016 U.S. presidential election.
The program operates through partnerships between Meta and fact-checking organizations, all of them members of the International Fact-Checking Network (IFCN), a network organized by the Poynter Institute that establishes a series of criteria and methodologies they must follow. According to Meta, more than 80 organizations working in 60 different languages participate.
The organizations identify potential misinformation through their own monitoring or through suggestions from Meta. They classify the content according to their truthfulness criteria and notify Meta when it is false or misleading.
“But that’s where we stop. After that, it’s up to Meta to decide what to do with this classification,” Arturo Daen, editor of Sabueso, the fact-checking division of Mexico's Animal Político, told LJR. “They’re the ones who decide whether to simply add a warning that the content is misleading or false, or to determine the level of exposure for that content.”
All fact-checking organizations confirm that decisions about content removal, user penalties, or reducing content reach rest solely with Meta.
“Fact-checkers have no power other than placing warnings in Meta’s internal system,” said Natália Leal. “We don’t even know how Meta uses the information we provide to make decisions.”
Until Zuckerberg’s announcement, the relationship between Meta and the agencies was very positive, according to fact-checkers. The Brazilian site Aos Fatos was one of the first organizations in the region to join the partnership in 2018.
“They were always very receptive and even grateful for our work,” Aos Fatos director and co-founder Tai Nalon told LJR. “Meta has always shown a great deal of concern during crisis situations, such as natural disasters, the events of January 8 [2023, when rioters attacked government buildings in Brasília], or scenarios requiring quick fact-checking responses.”
For fact-checking professionals, one of the most objectionable parts of Zuckerberg’s statement was his suggestion that fact-checkers censor content.
“In his statement, Mark accuses fact-checkers of supposedly contributing to censorship, but that’s not the case,” Daen said. “We contribute to improving the quality of information in the debate, but we don’t decide whether or not certain content is removed.”
Potential Financial Cuts
If Meta ends its support for fact-checking beyond the United States, its partner organizations will face significant financial losses.
“The importance of this partnership for the sustainability of the fact-checking ecosystem is undeniable,” Nalon said. “No platform has had the scale or an anti-disinformation program as robust as Meta’s. The partnership represented a paradigm shift in digital fact-checking starting in 2016.”
Managers at the organizations say they have other sources of revenue but do not deny the importance of Meta’s funding. According to Natália Leal, the amount paid varies by country, and terms are negotiated with each organization.
“For us, up to 20% of our resources come from the partnership,” Leal said. “But Lupa is an exception in the fact-checking universe. For some organizations, 80%, 90%, or even 100% of their resources come from the Meta partnership.”
According to Olivia Sohr, director of impact and new initiatives at Argentina’s Chequeado and coordinator of the regional fact-checkers network LatAm Chequea, which includes 47 organizations in 21 countries, about half the network members have partnerships with Meta. Even if many organizations survive a potential end to the program, some initiatives will likely be discontinued.
“Fact-checking divisions within larger organizations are especially vulnerable,” Sohr told LJR.
Meta’s exit raises an additional concern: the fact-checking sector receives little funding from other journalism funders, such as philanthropic foundations, because of the perception that its resources already come from deep-pocketed digital platforms, Nalon said.
“There isn’t a consolidated ecosystem of donors for fact-checking,” Nalon said. “That’s the most worrying part. It’s unclear how to create alternative funding models beyond digital platforms. It’s necessary for foundations, private companies, and donors to step in to fill the gap.”
Meta still says on its website that “the program is operational and that people value the warnings we apply to content after a fact-checking partner has classified it.” According to the company, after interviewing users who saw platform warnings, “74% felt they had seen the right amount or were open to seeing more warnings about false information. Of these, 63% considered the classifications accurate.”
The Latin American country that reacted most strongly to Meta’s announcement was Brazil. The Brazilian Attorney General’s Office requested explanations from the company and convened a public hearing this Wednesday (Jan. 22) for Meta officials to clarify the policy changes.
The Attorney General’s Office is calling on Meta officials in Brazil to discuss the company’s hate speech policies, the spread of criminal content, the sustainability of professional fact-checking programs, and the impacts of content moderation on historically marginalized groups.
Last week, Meta responded in writing to the Attorney General’s Office. The company said the decision to end fact-checking partnerships applies only to the United States; that it will test a "community notes" system in the U.S. to replace fact-checking; that it is "committed to respecting human rights" and "freedom of expression, a fundamental human right that enables the exercise of many other rights"; and that it continues to prioritize user safety and privacy and takes its role in preventing abuse seriously.
Fact-checking alone is not enough to eradicate misinformation, but it is important to inhibit it, Olivia Sohr emphasized. “If we manage to fact-check at the right moment, fewer people share misinformation,” she said.
Sohr added that she does not believe community notes are an effective resource for combating false and misleading information. “The results we’ve seen on X so far are very negative,” she said.
Professional fact-checking follows established criteria, such as transparency, editorial policies, and objective standards, and Meta had long been supportive of the program, Natália Leal said.
“It’s very troubling when criticism happens publicly without a chance to defend ourselves,” Leal said. “Meta had always stood by the program—Mark Zuckerberg himself defended it before the U.S. Congress. Since 2018, we’ve never received negative feedback from Meta, and now the program is being attacked as something harmful.”