The 2024 Foundation Model Transparency Index

Foundation models are increasingly consequential yet extremely opaque. To characterize the status quo, the Foundation Model Transparency Index (FMTI) was launched in October 2023 to measure the transparency of leading foundation model developers. FMTI 2023 assessed 10 major foundation model developers (e.g. OpenAI, Google) on 100 transparency indicators (e.g. does the developer disclose the wages it pays for data labor?). At the time, developers publicly disclosed very limited information, and the average score was 37 out of 100. To understand how the status quo has changed, we conduct a follow-up study six months later, scoring 14 developers against the same 100 indicators. While in FMTI 2023 we searched for publicly available information, in FMTI 2024 developers submit reports on the 100 transparency indicators, potentially including information that was not previously public. We find that developers now score 58 out of 100 on average, a 21 point improvement over FMTI 2023. Much of this increase is driven by developers disclosing information during the FMTI 2024 process: on average, each developer disclosed previously non-public information for 16.6 indicators. We observe regions of sustained (i.e. across 2023 and 2024) and systemic (i.e. across most or all developers) opacity, such as copyright status, data access, data labor, and downstream impact. We publish a transparency report for each developer that consolidates the information the developer disclosed to us. Our findings demonstrate that transparency can be improved in this nascent ecosystem and that the Foundation Model Transparency Index likely contributes to these improvements; policymakers should consider interventions in areas where transparency has not improved.
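As a minimal sketch of the scoring arithmetic described above (not the actual FMTI pipeline; the binary indicator scoring, developer names, and function names here are illustrative assumptions), per-developer totals and the cross-developer average could be computed as follows:

```python
from statistics import mean

# Hypothetical indicator-level scores: 1 if the developer satisfies the
# indicator, 0 otherwise. The real FMTI uses 100 such indicators per developer.
indicator_scores = {
    "DeveloperA": [1, 0, 1, 1, 0],
    "DeveloperB": [0, 0, 1, 0, 1],
}

def developer_score(scores):
    """Count of satisfied indicators; with 100 indicators this is a score out of 100."""
    return sum(scores)

per_developer = {dev: developer_score(s) for dev, s in indicator_scores.items()}
average_score = mean(per_developer.values())

print(per_developer)   # e.g. {'DeveloperA': 3, 'DeveloperB': 2}
print(average_score)   # cross-developer average, analogous to the reported 58/100
```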
@article{bommasani2025_2407.12929,
  title={The 2024 Foundation Model Transparency Index},
  author={Rishi Bommasani and Kevin Klyman and Sayash Kapoor and Shayne Longpre and Betty Xiong and Nestor Maslej and Percy Liang},
  journal={arXiv preprint arXiv:2407.12929},
  year={2025}
}