Asymptotic Generalization Bound of Fisher's Linear Discriminant Analysis

IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2012
Abstract

Fisher's linear discriminant analysis (FLDA) is an important dimension reduction method in statistical pattern recognition. It has been shown that FLDA is asymptotically Bayes optimal under the homoscedastic Gaussian assumption. However, this classical result has the following two major limitations: 1) it holds only for a fixed dimensionality D, and thus does not apply when D and the training sample number N are proportionally large; 2) it does not provide a quantitative description of the performance of FLDA. In this paper, we present an asymptotic generalization analysis of FLDA based on random matrix theory in the setting where both D and N increase and lim D/N = γ ∈ [0, 1). The obtained asymptotic generalization bound overcomes both limitations of the classical result, i.e., it is applicable when D and N are proportionally large and provides a quantitative description of the generalization ability of FLDA in terms of the ratio D/N and the population discrimination power.
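To make the setting concrete, the following is a minimal NumPy sketch of binary FLDA under the homoscedastic Gaussian assumption discussed above (this is a standard textbook formulation, not code from the paper; the data dimensions, class means, and variable names are illustrative assumptions). It projects onto the Fisher direction w = S_w⁻¹(μ̂₁ − μ̂₀), which is well defined when D/N < 1 so that the pooled scatter matrix is invertible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two homoscedastic Gaussian classes (illustrative choice):
# D = 10 dimensions, N = 200 training samples, so D/N = 0.05 < 1.
D, n_per_class = 10, 100
mu0, mu1 = np.zeros(D), np.full(D, 0.8)
X0 = rng.normal(mu0, 1.0, size=(n_per_class, D))
X1 = rng.normal(mu1, 1.0, size=(n_per_class, D))

# Pooled within-class scatter matrix S_w (identity covariance in truth,
# but estimated from data as FLDA actually does).
S_w = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)

# Fisher discriminant direction w = S_w^{-1} (mu1_hat - mu0_hat).
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
w = np.linalg.solve(S_w, m1 - m0)

# Classify by thresholding the 1-D projection at the midpoint of the
# projected class means.
threshold = w @ (m0 + m1) / 2.0

def predict(x):
    """Return 1 if x projects past the midpoint toward class 1, else 0."""
    return int(w @ x > threshold)
```

When D/N approaches 1, S_w becomes ill-conditioned and the plug-in direction degrades, which is exactly the regime the paper's γ ∈ [0, 1) analysis quantifies.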
