Meta, the parent company of Facebook and Instagram, is shifting its approach to combating misinformation by relying on community notes, a crowd-sourced fact-checking system. The model enlists nearly a million unpaid contributors who flag and correct misleading posts, much as volunteers write and edit Wikipedia. Despite criticism, Meta says the change will increase the speed and volume of fact-checks while maintaining political neutrality. The move marks a significant departure from traditional fact-checking and has sparked debate over its efficacy and its implications for content moderation.
Community notes empower users to identify and correct false information on Meta's platforms. Research indicates that while the system produces hundreds of fact-checks a day, more than 90% of proposed notes are never published. Even so, one study found that 98% of community notes about Covid-19 were accurate, suggesting the method can deliver high-quality fact-checks. The notes can also significantly slow the viral spread of misleading posts: X reports that appending a note to a post can cut its spread by more than half and increases the likelihood that the original poster deletes it by 80%.
Meta's decision to embrace community notes comes alongside relaxed rules on politically sensitive topics such as gender and immigration. As Keith Coleman points out, Meta is not ending its fact-checking program but rather evolving it into a more capable system. Mark Zuckerberg has faced criticism for this change, with some accusing him of pandering to political pressures.
"Mark Zuckerberg was clearly pandering to the incoming administration and to Elon Musk." – Alexios Mantzarlis
Despite these criticisms, Meta maintains that its new approach will yield more fact checks and faster responses. Alexios Mantzarlis suggests that this shift could be a strategy for Meta to scale its content moderation efforts.
"Meta replaces existing fact checking program with approach that can scale to cover more content, respond faster and is trusted across the political spectrum" – Mark Zuckerberg
The community notes system is not without its challenges. The fact that more than 90% of proposed notes are never published raises questions about how much misleading content it actually covers, though research supports its potential to deliver quality fact-checks. Its success also hinges on contributors remaining unbiased; on X, a note is only shown once contributors with differing viewpoints agree it is helpful, whereas professional fact-checkers rely on editorial standards to reach the same neutrality.
"Crowd-sourcing can be a useful component of [an] information moderation system, but it should not be the only component." – Professor Tom Stafford
Professional fact-checkers have traditionally played a crucial role in identifying dangerous misinformation and emerging harmful narratives. Baybars Orsek argues that their expertise is indispensable in tackling misinformation effectively. Meanwhile, Mark Zuckerberg contends that traditional fact-checkers have been politically biased, particularly in the United States, eroding trust instead of building it.
"Fact checkers have just been too politically biased and have destroyed more trust than they've created, especially in the US" – Mark Zuckerberg
Meta's decision to move away from professional fact-checkers has sparked debate over the company's motives and its potential impact on content moderation. Some view this as a cost-cutting measure or a response to external pressures.
"Fact checkers started becoming arbiters of truth in a substantial way that upset politically-motivated partisans and people in power and suddenly, weaponised attacks were on them" – Alexios Mantzarlis
Despite these concerns, Meta will continue to employ thousands of moderators who remove millions of pieces of harmful content each day, including graphic violence and material related to child sexual exploitation. The decision reflects a broader trend towards community-driven moderation, inspired by X's community notes system.
"Birdwatch", as it was then known, began in 2021 and drew inspiration from Wikipedia, which is written and edited by volunteers. – X
The transition to community notes represents a significant shift in how Meta addresses misinformation. Challenges remain, particularly around contributor bias and the large share of notes that never get published, but the system has shown promise in delivering accurate fact-checks and curbing the spread of misleading posts.