Efficient Estimation of the Central Mean Subspace via Smoothed Gradient Outer Products

24 December 2023
Gan Yuan
Mingyue Xu
Samory Kpotufe
Daniel Hsu
Abstract

We consider the problem of sufficient dimension reduction (SDR) for multi-index models. Estimators of the central mean subspace in prior work either have slow (non-parametric) convergence rates or rely on stringent distributional conditions (e.g., the covariate distribution $P_{\mathbf{X}}$ being elliptically symmetric). In this paper, we show that a fast parametric convergence rate of the form $C_d \cdot n^{-1/2}$ is achievable by estimating the \emph{expected smoothed gradient outer product}, for a general class of distributions $P_{\mathbf{X}}$ admitting Gaussian or heavier tails. When the link function is a polynomial of degree at most $r$ and $P_{\mathbf{X}}$ is the standard Gaussian, we show that the prefactor depends on the ambient dimension $d$ as $C_d \propto d^r$.
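The central object here is the gradient outer product $\mathbb{E}[\nabla m(\mathbf{X}) \nabla m(\mathbf{X})^\top]$, where $m(\mathbf{x}) = \mathbb{E}[Y \mid \mathbf{X} = \mathbf{x}]$; its top eigenvectors span the central mean subspace. The sketch below illustrates this idea with a classical OPG-style estimator (local linear gradient estimates, then an eigendecomposition of their averaged outer product). The Gaussian-kernel smoother, the bandwidth choice, and the synthetic model are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def local_linear_gradients(X, y, bandwidth):
    """Estimate the gradient of m(x) = E[Y | X = x] at each sample point
    via local linear regression with a Gaussian kernel. This is a generic
    OPG-style smoother for illustration, not the paper's estimator."""
    n, d = X.shape
    grads = np.empty((n, d))
    for i in range(n):
        diff = X - X[i]                                   # (n, d) centered design
        w = np.exp(-0.5 * np.sum(diff**2, axis=1) / bandwidth**2)
        sw = np.sqrt(w)
        Z = np.hstack([np.ones((n, 1)), diff])            # intercept + slope terms
        beta, *_ = np.linalg.lstsq(Z * sw[:, None], y * sw, rcond=None)
        grads[i] = beta[1:]                               # slope = gradient estimate
    return grads

def estimate_central_mean_subspace(X, y, k, bandwidth):
    """Span of the top-k eigenvectors of the averaged gradient outer product."""
    G = local_linear_gradients(X, y, bandwidth)
    M = G.T @ G / len(y)                                  # empirical E[grad grad^T]
    _, eigvecs = np.linalg.eigh(M)                        # ascending eigenvalues
    return eigvecs[:, -k:]                                # columns span the estimate

# Synthetic multi-index model: Y depends on X only through two directions.
rng = np.random.default_rng(0)
n, d = 1000, 5
X = rng.standard_normal((n, d))
y = X[:, 0]**2 + X[:, 1] + 0.1 * rng.standard_normal(n)

B_hat = estimate_central_mean_subspace(X, y, k=2, bandwidth=1.0)  # (d, k), orthonormal columns
```

The returned columns give an orthonormal basis for the estimated subspace; recovery quality can be measured by comparing the projection matrix `B_hat @ B_hat.T` against that of the true index directions.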
