Crowdsourced fact-checking fights misinformation in Taiwan


A new study from Cornell University finds that while professional journalists and fact-checkers struggle to keep up with the deluge of misinformation online, sites that rely on loosely coordinated volunteer contributions, like Wikipedia, can help fill the gaps.

In the study, researchers compared professional fact-checking articles to posts on Cofacts, a community fact-checking platform in Taiwan. They found that the crowdsourced site often responded to queries faster than professionals and addressed a different range of issues.

“Fact-checking is an essential part of being able to use our information ecosystem in a way that supports reliable information,” said lead author Mor Naaman, professor of information science. “Places of knowledge production, like Wikipedia and Cofacts, have proven to be the most resilient to disinformation campaigns so far.”

Andy Zhao, a doctoral student in information science, used natural language processing to match answers posted on Cofacts with articles addressing the same questions on two professional fact-checking sites. The study looked at how quickly the sites published responses to queries, the accuracy and persuasiveness of those responses, and the range of topics covered.
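
The article does not describe the matching method in detail; as a minimal illustrative sketch (not the authors' actual pipeline), one common way to pair a community post with the professional article most similar to it is cosine similarity over TF-IDF vectors, which can be done with the Python standard library alone:

```python
# Sketch only: pair a community fact-check post with the most textually
# similar professional article using TF-IDF weights and cosine similarity.
# All example texts below are invented for illustration.
import math
from collections import Counter

def tfidf_vectors(token_docs):
    """Turn a list of token lists into sparse TF-IDF dicts."""
    n = len(token_docs)
    df = Counter()                      # document frequency per term
    for doc in token_docs:
        df.update(set(doc))
    vecs = []
    for doc in token_docs:
        tf = Counter(doc)
        # Smoothed IDF (log(n/df) + 1) keeps shared terms above zero.
        vecs.append({t: tf[t] * (math.log(n / df[t]) + 1) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse term-weight dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_match(post, articles):
    """Index of the article most similar to the post."""
    docs = [post] + articles
    vecs = tfidf_vectors([d.lower().split() for d in docs])
    post_vec, article_vecs = vecs[0], vecs[1:]
    scores = [cosine(post_vec, v) for v in article_vecs]
    return max(range(len(articles)), key=scores.__getitem__)
```

For instance, `best_match("does the vaccine cause magnetism", [...])` would pick the professional article whose wording overlaps most with the query. Production systems would typically use lemmatization, stop-word handling, or sentence embeddings instead of raw token overlap.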

He found that Cofacts users often responded more quickly than journalists, but mostly because they could “stand on the shoulders of giants” and reuse existing professionally written articles. Cofacts thus acts as an information distributor.

Importantly, Zhao found that Cofacts publications were just as accurate as professional sources. According to seven Taiwanese-born graduate students who served as evaluators, the journalists’ stories were more persuasive, but those from Cofacts were often clearer.

Further analysis showed that the participatory site covered a slightly different range of topics than those covered by professionals. Articles on Cofacts were more likely to address recent, local issues, such as regional politics and small scams, while journalists were more likely to write about topics requiring expertise, including health claims and international affairs.

“We can harness the power of crowds to counter misinformation,” Zhao concluded. “Misinformation is coming from everywhere, and we need this battle to be fought everywhere.”

Despite Cofacts’ success in Taiwan, Zhao and Naaman caution that the same approach may not be applicable to other countries. “Cofacts was built on Taiwan’s user habits, cultures, context, and political and social structures, and that’s how they succeeded,” Zhao said.

However, understanding the success of Cofacts can inform the design of other fact-checking systems, especially in non-English-speaking regions with access to few, if any, fact-checking resources.

The results are published in the Journal of Online Trust and Safety.

More information:
Andy Zhao et al, Insights from a Comparative Study on the Variety, Velocity, Veracity, and Viability of Crowdsourced and Professional Fact-Checking Services, Journal of Online Trust and Safety (2023). DOI: 10.54501/jots.v2i1.118

Provided by Cornell University

Quote: Crowdsourced fact-checking combats misinformation in Taiwan (November 21, 2023)

