Rates of Fisher information convergence in the central limit theorem for
nonlinear statistics
Probability theory and related fields (PTRF), 2022
Abstract
We develop a general method to study the Fisher information distance in the central limit theorem for nonlinear statistics. We first construct completely new representations for the score function, and then use these representations to derive quantitative estimates of the Fisher information distance. To illustrate the applicability of our approach, we provide explicit rates of Fisher information convergence for quadratic forms and functions of sample means. For sums of independent random variables, we obtain Fisher information bounds without requiring the finiteness of the Poincaré constant. Our method can also be used to bound the Fisher information distance in non-central limit theorems.
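For context, the central object above can be made precise with the standard definition of the Fisher information distance (relative Fisher information) to the Gaussian; the notation below is generic and not quoted from the paper.

```latex
% For a standardized random variable X with differentiable density f
% and score function \rho(x) = f'(x)/f(x), the Fisher information
% distance to Z ~ N(0,1) is
J(X) = \mathbb{E}\!\left[\bigl(\rho(X) + X\bigr)^{2}\right],
\qquad \rho(x) = \frac{f'(x)}{f(x)}.
% J(X) = 0 iff X is standard Gaussian, and by the Gaussian
% log-Sobolev inequality it dominates relative entropy:
% D(X \,\|\, Z) \le \tfrac{1}{2}\, J(X).
```

Because $J$ dominates relative entropy (and hence total variation, via Pinsker's inequality), convergence rates in $J$ are among the strongest forms of quantitative central limit theorems.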
