Fact-Checking at Scale with Crowdsourcing: Experiments and Lessons Learned

David La Barbera; Michael Soprano; Kevin Roitero; Eddy Maddalena; Stefano Mizzaro
2023-01-01

Abstract

In this paper, we present our journey exploring the use of crowdsourcing for fact-checking. We discuss our early experiments aimed at identifying the best possible setting for misinformation assessment via crowdsourcing. Our results indicate that the crowd can effectively address misinformation at scale, showing some degree of correlation with expert judgments. We also highlight the influence of worker background on the quality of truthfulness assessments.
Files in this item:

Paper.pdf (open access)

Description: Published version
Type: Publisher's version (PDF)
License: Creative Commons
Size: 811.93 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11390/1257564
Citazioni
  • Scopus 0