On the Impact of Showing Evidence from Peers in Crowdsourced Truthfulness Assessments.

ACM Trans. Inf. Syst. (2024)

Abstract
Misinformation spreads rapidly online. The common approach to countering it is to deploy expert fact-checkers who follow forensic processes to determine the veracity of statements. Unfortunately, this approach does not scale well, and crowdsourcing has been explored as a way to complement the work of trained journalists. In this article, we study the effect of showing crowd workers evidence collected by others while they judge the veracity of statements. We implement variants of the judgment task design to understand whether and how the presented evidence affects the way crowd workers judge truthfulness and how well they perform. Our results show that, in certain cases, the presented evidence and the way in which it is presented can mislead crowd workers who would otherwise be more accurate judging independently. Workers who make appropriate use of the provided evidence, however, benefit from it and produce better judgments.
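As an illustration of the kind of comparison the abstract describes, the sketch below computes judgment accuracy per task design against ground-truth labels. It is a minimal sketch: the condition names, data, and accuracy metric are hypothetical assumptions for illustration, not the paper's actual experimental setup or results.

```python
# Hypothetical sketch: comparing crowd judgment accuracy across task designs.
# All records and condition names below are invented for illustration; the
# paper's actual task variants, rating scales, and metrics may differ.

from statistics import mean

# Each record: (condition, worker_judgment, ground_truth_label)
# Judgments and labels are booleans: True = "statement is truthful".
judgments = [
    ("independent", True, True),
    ("independent", False, False),
    ("independent", True, False),
    ("with_peer_evidence", False, False),
    ("with_peer_evidence", True, False),
    ("with_peer_evidence", False, True),
]

def accuracy(condition: str) -> float:
    """Fraction of judgments matching ground truth within one condition."""
    matches = [j == t for c, j, t in judgments if c == condition]
    return mean(matches)

for cond in ("independent", "with_peer_evidence"):
    print(f"{cond}: accuracy = {accuracy(cond):.2f}")
```

With real data, a per-condition accuracy comparison like this (plus a significance test) is one straightforward way to check whether showing peer evidence helps or misleads workers.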
Keywords
Misinformation, crowdsourcing, metadata, information credibility