

Optimal Rates of Convergence of Transelliptical Component Analysis

29 May 2013
Fang Han
Han Liu
Abstract

Han and Liu (2012) proposed a method named transelliptical component analysis (TCA) for conducting scale-invariant principal component analysis on high dimensional data with transelliptical distributions. The transelliptical family assumes that the data follow an elliptical distribution after unspecified marginal monotone transformations. In a double asymptotic framework where the dimension d is allowed to increase with the sample size n, Han and Liu (2012) showed that one version of TCA attains a "nearly parametric" rate of convergence in parameter estimation when the parameter of interest is assumed to be sparse. This paper improves upon their results in two aspects: (i) Under the non-sparse setting (i.e., when the parameter of interest is not assumed to be sparse), we show that a version of TCA attains the optimal rate of convergence up to a logarithmic factor; (ii) Under the sparse setting, we also lay out avenues for analyzing the performance of the TCA estimator proposed in Han and Liu (2012). In particular, we provide a "sign subgaussian condition" that is sufficient for TCA to attain an improved rate of convergence, and we verify that a subfamily of the transelliptical distributions satisfies this condition.
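The rank-based estimator underlying TCA can be illustrated with a short sketch. Because the marginal transformations are monotone, Kendall's tau is invariant to them, and the latent correlation matrix is recovered via the transform sin(π τ̂ / 2); the leading eigenvector of that matrix then gives a scale-invariant principal component direction. The sketch below is a minimal, unoptimized illustration under these assumptions; the function name and synthetic data are ours, not from the paper.

```python
import numpy as np
from scipy.stats import kendalltau

def tca_correlation(X):
    """Rank-based latent correlation estimate: apply sin(pi * tau / 2)
    to the pairwise Kendall's tau statistics (invariant to monotone
    marginal transformations)."""
    n, d = X.shape
    R = np.eye(d)
    for j in range(d):
        for k in range(j + 1, d):
            tau, _ = kendalltau(X[:, j], X[:, k])
            R[j, k] = R[k, j] = np.sin(np.pi * tau / 2.0)
    return R

# Illustrative use: monotone-transformed Gaussian data (a transelliptical
# example), then take the top eigenvector of the estimated correlation.
rng = np.random.default_rng(0)
Z = rng.multivariate_normal(np.zeros(5), 0.5 * np.eye(5) + 0.5, size=200)
X = np.exp(Z)  # unspecified monotone marginal transformation
R = tca_correlation(X)
eigvals, eigvecs = np.linalg.eigh(R)
u1 = eigvecs[:, -1]  # scale-invariant leading PC direction
```

Because the estimator uses only ranks, the result is unchanged if any coordinate is passed through a further strictly increasing transformation.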
