A Unified Theory of Random Projection for Influence Functions
Pingbang Hu
Yuzheng Hu
Jiaqi W. Ma
Han Zhao
Main: 13 pages
4 figures
Bibliography: 3 pages
Appendix: 30 pages
Abstract
Influence functions and related data attribution scores take the form $\nabla \ell(z_{\text{test}})^\top H^{-1} \nabla \ell(z)$, where $H$ is a curvature operator. In modern overparameterized models, forming or inverting $H$ is prohibitive, motivating scalable influence computation via random projection with a sketch $P$. This practice is commonly justified via the Johnson--Lindenstrauss (JL) lemma, which ensures approximate preservation of Euclidean geometry for a fixed dataset. However, JL does not address how sketching behaves under inversion. Furthermore, no existing theory explains how sketching interacts with other widely used techniques, such as ridge regularization and structured curvature approximations.
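The abstract's pipeline can be illustrated with a minimal numerical sketch: project per-example gradients with a random matrix, form the sketched curvature with a ridge term, and score each training example against a test gradient. This is an assumption-laden toy (Gaussian sketch, Gauss--Newton-style curvature $G^\top G/n$, made-up dimensions and ridge $\lambda$), not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k, lam = 100, 500, 32, 1e-2   # illustrative sizes and ridge strength

G = rng.normal(size=(n, d))          # per-example training gradients (rows)
g_test = rng.normal(size=d)          # test-point gradient

# JL-style Gaussian sketch P: R^d -> R^k, scaled so E[||Px||^2] = ||x||^2
P = rng.normal(size=(k, d)) / np.sqrt(k)

Gp = G @ P.T                         # projected gradients, shape (n, k)
gp = P @ g_test                      # projected test gradient, shape (k,)

# Sketched curvature plus ridge regularization, inverted in the small space
H = Gp.T @ Gp / n + lam * np.eye(k)
scores = Gp @ np.linalg.solve(H, gp) # one influence score per training example

print(scores.shape)  # -> (100,)
```

The point of the sketch is that the solve happens in the $k \times k$ projected space rather than the prohibitive $d \times d$ original space; whether the resulting scores faithfully track the unsketched ones under inversion and regularization is exactly the question the paper's theory addresses.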
