Hessian of Perplexity for Large Language Models by PyTorch autograd (Open Source)

6 April 2025
Ivan Ilin
Abstract

Computing the full Hessian matrix, the matrix of second-order derivatives, for an entire Large Language Model (LLM) is infeasible due to its sheer size. In this technical report, we aim to provide a comprehensive guide on how to accurately compute at least a small portion of the Hessian for LLMs using the PyTorch autograd library. We also demonstrate how to compute the full diagonal of the Hessian matrix using multiple Hessian-vector product (HVP) samples. We hope that both this guide and the accompanying GitHub code will be valuable resources for practitioners and researchers interested in better understanding the behavior and structure of the Hessian in LLMs.
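The diagonal-via-HVPs idea mentioned in the abstract can be sketched with a Hutchinson-style estimator: for a Rademacher vector v (entries ±1), the elementwise product v * (Hv) is an unbiased estimate of the Hessian diagonal, and each HVP is obtained by double backpropagation through `torch.autograd.grad`. The function and variable names below are illustrative, not taken from the paper's released code; the toy quadratic loss is chosen so the exact diagonal is known.

```python
import torch

def hvp(loss_fn, params, v):
    """Hessian-vector product H @ v via double backprop (no full Hessian formed)."""
    params = params.detach().requires_grad_(True)
    loss = loss_fn(params)
    # First backward pass with create_graph=True so the gradient is differentiable.
    grad, = torch.autograd.grad(loss, params, create_graph=True)
    # Second backward pass contracts the Hessian with v.
    hv, = torch.autograd.grad(grad, params, grad_outputs=v)
    return hv

def diag_estimate(loss_fn, params, n_samples=100):
    """Hutchinson estimate of the Hessian diagonal: mean of v * (H v), v Rademacher."""
    acc = torch.zeros_like(params)
    for _ in range(n_samples):
        v = (torch.randint(0, 2, params.shape) * 2 - 1).to(params.dtype)
        acc += v * hvp(loss_fn, params, v)
    return acc / n_samples

# Toy check: loss p^T A p with diagonal A has Hessian 2A, so diag = (2, 4, 6).
A = torch.diag(torch.tensor([1.0, 2.0, 3.0]))
loss_fn = lambda p: p @ A @ p
p = torch.randn(3)
est = diag_estimate(loss_fn, p, n_samples=10)
print(est)
```

For a diagonal Hessian the estimator is exact in every sample (v_i^2 = 1), which makes this a convenient sanity check; for a real LLM perplexity loss the same `hvp` routine applies per parameter tensor, with the estimate averaged over many samples.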

@article{ilin2025_2504.04520,
  title={Hessian of Perplexity for Large Language Models by PyTorch autograd (Open Source)},
  author={Ivan Ilin},
  journal={arXiv preprint arXiv:2504.04520},
  year={2025}
}