ResearchTrend.AI
arXiv:1208.3030
Asymptotic Generalization Bound of Fisher's Linear Discriminant Analysis

15 August 2012
Wei Bian
Dacheng Tao
Abstract

Fisher's linear discriminant analysis (FLDA) is an important dimension reduction method in statistical pattern recognition. It has been shown that FLDA is asymptotically Bayes optimal under the homoscedastic Gaussian assumption. However, this classical result has two major limitations: 1) it holds only for a fixed dimensionality D, and thus does not apply when D and the training sample size N are proportionally large; 2) it does not provide a quantitative description of how the generalization ability of FLDA is affected by D and N. In this paper, we present an asymptotic generalization analysis of FLDA based on random matrix theory, in a setting where both D and N increase and D/N → γ ∈ [0, 1). The obtained lower bound on the generalization discrimination power overcomes both limitations of the classical result, i.e., it is applicable when D and N are proportionally large and provides a quantitative description of the generalization ability of FLDA in terms of the ratio γ = D/N and the population discrimination power. Moreover, the discrimination power bound also leads to an upper bound on the generalization error of binary classification with FLDA.
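To make the role of the ratio γ = D/N concrete, here is a minimal numerical sketch (not taken from the paper; the sample size N = 400, the mean separation Δ = 3, and the identity-covariance model are illustrative assumptions). It fits FLDA on homoscedastic Gaussian data and measures the empirical test error as γ grows toward 1, where the pooled covariance estimate degrades:

```python
import numpy as np

def flda_fit(X0, X1):
    """Fit Fisher's LDA: w = S^{-1}(mu1 - mu0), with S the pooled covariance."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    S = ((X0 - mu0).T @ (X0 - mu0) + (X1 - mu1).T @ (X1 - mu1)) / (n0 + n1 - 2)
    w = np.linalg.solve(S, mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)  # midpoint threshold (equal priors)
    return w, b

def flda_error(D, N, delta=3.0, n_test=4000, rng=None):
    """Empirical test error of FLDA: class means at +/- (delta/2) e1, identity covariance."""
    if rng is None:
        rng = np.random.default_rng(0)
    mu = np.zeros(D)
    mu[0] = delta / 2
    # N training points, split evenly between the two classes.
    X0 = rng.standard_normal((N // 2, D)) - mu
    X1 = rng.standard_normal((N // 2, D)) + mu
    w, b = flda_fit(X0, X1)
    # Fresh test samples from the same two Gaussians.
    T0 = rng.standard_normal((n_test, D)) - mu
    T1 = rng.standard_normal((n_test, D)) + mu
    return 0.5 * ((T0 @ w + b > 0).mean() + (T1 @ w + b <= 0).mean())

# As gamma = D/N grows toward 1, the test error rises well above the
# Bayes error of this model, Phi(-delta/2) ~ 0.067.
for gamma in (0.1, 0.5, 0.9):
    N = 400
    D = int(gamma * N)
    print(f"gamma={gamma}: test error ~ {flda_error(D, N):.3f}")
```

This matches the regime the paper analyzes: D and N grow together with D/N → γ ∈ [0, 1), and the generalization ability of FLDA deteriorates as γ increases even though D < N keeps the pooled covariance invertible.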
