ESGBench: A Benchmark for Explainable ESG Question Answering in Corporate Sustainability Reports
Abstract
We present ESGBench, a benchmark dataset and evaluation framework designed to assess explainable ESG question answering systems using corporate sustainability reports. The benchmark consists of domain-grounded questions across multiple ESG themes, paired with human-curated answers and supporting evidence to enable fine-grained evaluation of model reasoning. We analyze the performance of state-of-the-art LLMs on ESGBench, highlighting key challenges in factual consistency, traceability, and domain alignment. ESGBench aims to accelerate research in transparent and accountable ESG-focused AI systems.
