Efficient Estimation of Linear Functionals of Principal Components

We study principal component analysis (PCA) for mean zero i.i.d. Gaussian observations $X_1,\dots,X_n$ in a separable Hilbert space with unknown covariance operator $\Sigma$. The complexity of the problem is characterized by its effective rank $\mathbf{r}(\Sigma) := \operatorname{tr}(\Sigma)/\|\Sigma\|$, where $\operatorname{tr}(\Sigma)$ denotes the trace of $\Sigma$ and $\|\Sigma\|$ denotes its operator norm. This framework includes, in particular, high-dimensional spiked covariance models as well as some models in functional PCA and kernel PCA in machine learning. We develop a method of bias reduction in the problem of estimation of linear functionals of eigenvectors of $\Sigma$ (principal components). Under the assumption that $\mathbf{r}(\Sigma) = o(n)$, we establish the asymptotic normality and asymptotic properties of the risk of the resulting estimators and prove matching minimax lower bounds, showing their semi-parametric optimality.
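For readability, the key quantity can be spelled out as follows; the notation below is a standard reading of the effective rank named in the abstract (a sketch of the setup, not a verbatim excerpt from the paper):

\[
  \mathbf{r}(\Sigma) \;:=\; \frac{\operatorname{tr}(\Sigma)}{\|\Sigma\|},
  \qquad
  \operatorname{tr}(\Sigma) = \sum_{j \ge 1} \lambda_j(\Sigma),
  \qquad
  \|\Sigma\| = \lambda_1(\Sigma),
\]
where $\lambda_1(\Sigma) \ge \lambda_2(\Sigma) \ge \dots$ are the eigenvalues of $\Sigma$, and the main growth condition reads $\mathbf{r}(\Sigma) = o(n)$ as the sample size $n \to \infty$.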