Towards Budget-Friendly Model-Agnostic Explanation Generation for Large Language Models

18 May 2025
Junhao Liu
Haonan Yu
Xin Zhang
Abstract

With large language models (LLMs) becoming increasingly prevalent in various applications, interpreting their predictions has become a critical challenge. Because LLMs vary in architecture and some are closed-source, model-agnostic techniques show great promise, as they require no access to a model's internal parameters. However, existing model-agnostic techniques must invoke the LLM many times to gather enough samples for generating faithful explanations, which incurs high economic costs. In this paper, we show through a series of empirical studies that it is practical to generate faithful explanations for large-scale LLMs by sampling from budget-friendly models instead. Moreover, we show that such proxy explanations also perform well on downstream tasks. Our analysis provides a new paradigm for model-agnostic explanation methods for LLMs: incorporating information from budget-friendly models.
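
The idea lends itself to a short illustration. Below is a minimal sketch (not the authors' implementation) of a LIME-style perturbation explanation in which the many perturbation samples are scored by a cheap proxy model rather than the expensive target LLM; proxy_score is a hypothetical placeholder for any budget-friendly model's scoring function.

# Minimal sketch, assuming a LIME-style masking explainer and a
# hypothetical cheap scoring function `proxy_score`. Not the authors' code.
import numpy as np

def proxy_score(text: str) -> float:
    # Placeholder for a budget-friendly model's scalar score of `text`.
    # In practice this would call a small/open model instead of the LLM.
    return float(len(text.split()))

def explain_with_proxy(text: str, n_samples: int = 500, seed: int = 0):
    # Attribute importance to each token via random masking: draw binary
    # masks, score masked variants with the *proxy* model, and fit a
    # linear surrogate whose weights serve as the explanation.
    rng = np.random.default_rng(seed)
    tokens = text.split()
    d = len(tokens)

    masks = rng.integers(0, 2, size=(n_samples, d))   # 1 = keep token
    scores = np.empty(n_samples)
    for i, m in enumerate(masks):
        variant = " ".join(t for t, keep in zip(tokens, m) if keep)
        scores[i] = proxy_score(variant)              # cheap queries only

    # Least-squares fit of a linear surrogate; weights approximate
    # per-token importance, with a trailing bias column.
    X = np.hstack([masks, np.ones((n_samples, 1))])
    w, *_ = np.linalg.lstsq(X, scores, rcond=None)
    return dict(zip(tokens, w[:d]))

if __name__ == "__main__":
    print(explain_with_proxy("budget friendly proxy explanations"))

Under the paradigm described in the abstract, the surrogate weights fitted from the proxy's answers would then be used as the explanation for the large model's behavior, on the assumption that the proxy's samples are informative enough to keep the explanation faithful.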

@article{liu2025_2505.12509,
  title={Towards Budget-Friendly Model-Agnostic Explanation Generation for Large Language Models},
  author={Junhao Liu and Haonan Yu and Xin Zhang},
  journal={arXiv preprint arXiv:2505.12509},
  year={2025}
}