Citation

Scaling Up Fact-Checking Using the Wisdom of Crowds

Authors:
Allen, Jennifer Nancy Lee; Arechar, Antonio Alonso; Pennycook, Gordon; Rand, David
Year:
2020

Misinformation on social media has become a major focus of research and concern in recent years. Perhaps the most prominent approach to combating misinformation is the use of professional fact-checkers. This approach, however, is not scalable: Professional fact-checkers cannot possibly keep up with the volume of misinformation produced every day. Furthermore, many people see fact-checkers as having a liberal bias and thus distrust them. Here, we explore a potential solution to both of these problems: leveraging the “wisdom of crowds” to identify misinformation at scale using politically balanced groups of laypeople. Using a set of 207 news articles flagged for fact-checking by an internal Facebook algorithm, we compare the accuracy ratings given by (i) three professional fact-checkers after researching each article and (ii) 1,128 Americans from Amazon Mechanical Turk after simply reading the headline and lede sentence. We find that the average rating of a politically balanced crowd of 10 laypeople is as correlated with the average fact-checker rating as the fact-checkers’ ratings are correlated with each other. Furthermore, the layperson ratings predict with high accuracy whether the majority of fact-checkers rated a headline as “true,” particularly for headlines where all three fact-checkers agree. We also find that layperson cognitive reflection, political knowledge, and Democratic Party preference are positively related to agreement with fact-checker ratings; and that informing laypeople of each headline’s publisher leads to a small increase in agreement with fact-checkers. Our results indicate that crowdsourcing is a promising approach for helping to identify misinformation at scale.
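The core comparison in the abstract is statistical: the correlation between the crowd's mean rating and the fact-checkers' mean rating, benchmarked against the average pairwise correlation among the fact-checkers themselves. The sketch below is not the authors' analysis code; it uses synthetic ratings and assumed array shapes purely to illustrate how that comparison could be computed.

```python
# Illustrative sketch (synthetic data, not the study's actual ratings or code):
# compare (a) how well a politically balanced crowd-of-10 mean tracks the mean
# fact-checker rating with (b) how well the fact-checkers track one another.
from itertools import combinations

import numpy as np

rng = np.random.default_rng(0)

n_articles = 207        # articles flagged by the internal Facebook algorithm
n_fact_checkers = 3     # professional fact-checkers who researched each article
crowd_size = 10         # politically balanced laypeople per crowd

# Synthetic latent "accuracy" per article, plus noisier individual ratings.
signal = rng.normal(size=n_articles)
fc_ratings = signal[:, None] + rng.normal(scale=0.6, size=(n_articles, n_fact_checkers))
lay_ratings = signal[:, None] + rng.normal(scale=1.5, size=(n_articles, crowd_size))

# Benchmark: average pairwise correlation among the fact-checkers.
fc_pairwise = [
    np.corrcoef(fc_ratings[:, i], fc_ratings[:, j])[0, 1]
    for i, j in combinations(range(n_fact_checkers), 2)
]
fc_benchmark = float(np.mean(fc_pairwise))

# Crowd performance: correlation of the crowd mean with the fact-checker mean.
crowd_vs_fc = float(np.corrcoef(lay_ratings.mean(axis=1), fc_ratings.mean(axis=1))[0, 1])

print(f"Mean pairwise fact-checker correlation: {fc_benchmark:.2f}")
print(f"Crowd-of-10 mean vs. fact-checker mean:  {crowd_vs_fc:.2f}")
```

Under this framing, the paper's headline result is that the second number is roughly as large as the first, i.e., a small balanced crowd agrees with the fact-checker consensus about as well as individual fact-checkers agree with each other.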