Show Me the Work: Fact-Checkers' Requirements for Explainable Automated Fact-Checking

Abstract

The pervasiveness of large language models and generative AI in online media has amplified the need for effective automated fact-checking to assist fact-checkers in tackling the increasing volume and sophistication of misinformation. The complex nature of fact-checking demands that automated fact-checking systems provide explanations that enable fact-checkers to scrutinise their outputs. However, it is unclear how these explanations should align with fact-checkers' decision-making and reasoning processes in order to integrate effectively into their workflows. Through semi-structured interviews with fact-checking professionals, we bridge this gap by: (i) providing an account of how fact-checkers assess evidence, make decisions, and explain their processes; (ii) examining how fact-checkers use automated tools in practice; and (iii) identifying fact-checkers' explanation requirements for automated fact-checking tools. The findings reveal unmet explanation needs and identify key criteria for replicable fact-checking explanations: tracing the model's reasoning path, referencing specific evidence, and highlighting uncertainty and information gaps.

@article{warren2025_2502.09083,
  title={Show Me the Work: Fact-Checkers' Requirements for Explainable Automated Fact-Checking},
  author={Greta Warren and Irina Shklovski and Isabelle Augenstein},
  journal={arXiv preprint arXiv:2502.09083},
  year={2025}
}