ResearchTrend.AI

arXiv:2002.10008 v3 (latest)

Conditional regression for single-index models

23 February 2020
A. Lanteri
Mauro Maggioni
Stefano Vigogna
Abstract

The single-index model is a statistical model for intrinsic regression in which responses are assumed to depend on a single yet unknown linear combination of the predictors, allowing the regression function to be expressed as $\mathbb{E}[Y \mid X] = f(\langle v, X \rangle)$ for some unknown \emph{index} vector $v$ and \emph{link} function $f$. Conditional methods provide a simple and effective approach to estimating $v$ by averaging moments of $X$ conditioned on $Y$, but they depend on parameters whose optimal choice is unknown and provide no generalization bounds on $f$. In this paper we propose a new conditional method converging at $\sqrt{n}$ rate under an explicit parameter characterization. Moreover, we prove that polynomial partitioning estimates achieve the $1$-dimensional minimax rate for regression of H\"older functions when combined with any $\sqrt{n}$-convergent index estimator. Overall, this yields an estimator for dimension reduction and regression of single-index models that attains statistical optimality in quasilinear time.
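To illustrate the conditional-moment idea the abstract refers to, here is a minimal sketch of a slicing estimator for the index direction $v$, in the style of sliced inverse regression: average $X$ within slices of $Y$ and take the leading eigenvector of the covariance of those conditional means. This is an illustration of the general approach only, not the specific estimator or parameter choices proposed in the paper; the simulated link function and dimensions are arbitrary assumptions.

```python
import numpy as np

# Simulated single-index data: Y = f(<v, X>) + noise, with f = sin
# and v = e_1 (both chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
n, d = 5000, 10
v_true = np.zeros(d)
v_true[0] = 1.0
X = rng.standard_normal((n, d))          # standardized predictors
Y = np.sin(X @ v_true) + 0.1 * rng.standard_normal(n)

def slicing_index(X, Y, n_slices=20):
    """Estimate the index direction by averaging X within slices of Y
    (sliced-inverse-regression style; assumes X is standardized)."""
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    order = np.argsort(Y)
    # Weighted covariance of the slice-conditional means of X
    M = np.zeros((d, d))
    for idx in np.array_split(order, n_slices):
        m = Xc[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # The leading eigenvector spans the index direction (up to sign)
    w, V = np.linalg.eigh(M)
    v_hat = V[:, -1]
    return v_hat / np.linalg.norm(v_hat)

v_hat = slicing_index(X, Y)
print(abs(v_hat @ v_true))  # alignment with the true index, close to 1
```

The number of slices plays the role of the tuning parameter whose optimal choice the paper's conditional method characterizes explicitly.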
