ResearchTrend.AI
Confidence Sets under Generalized Self-Concordance

31 December 2022
Lang Liu
Zaïd Harchaoui
Abstract

This paper revisits a fundamental problem in statistical inference from a non-asymptotic theoretical viewpoint – the construction of confidence sets. We establish a finite-sample bound for the estimator, characterizing its asymptotic behavior in a non-asymptotic fashion. An important feature of our bound is that its dimension dependency is captured by the effective dimension – the trace of the limiting sandwich covariance – which can be much smaller than the parameter dimension in some regimes. We then illustrate how the bound can be used to obtain a confidence set whose shape is adapted to the optimization landscape induced by the loss function. Unlike previous works that rely heavily on the strong convexity of the loss function, we only assume that the Hessian is lower bounded at the optimum and allow it to gradually become degenerate. This property is formalized by the notion of generalized self-concordance, which originated in convex optimization. Moreover, we demonstrate how the effective dimension can be estimated from data and characterize its estimation accuracy. We apply our results to maximum likelihood estimation with generalized linear models, score matching with exponential families, and hypothesis testing with Rao's score test.
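To make the "effective dimension" concrete, the sketch below computes a plug-in estimate of the trace of the sandwich covariance for logistic regression (one of the generalized linear models the abstract mentions). This is a minimal illustration under standard plug-in assumptions, not the paper's exact estimator; the function name `effective_dimension` is hypothetical.

```python
import numpy as np

def effective_dimension(X, y, theta_hat):
    """Plug-in estimate of tr(H^{-1} G H^{-1}) for logistic regression,
    where H is the empirical Hessian of the log-loss and G the empirical
    covariance of the score (a sketch, not the paper's exact procedure)."""
    n, d = X.shape
    p = 1.0 / (1.0 + np.exp(-X @ theta_hat))  # predicted probabilities
    # Per-sample score of the log-loss: (p_i - y_i) * x_i
    S = (p - y)[:, None] * X
    G = S.T @ S / n                            # score covariance estimate
    W = p * (1.0 - p)                          # Hessian weights
    H = (X * W[:, None]).T @ X / n             # empirical Hessian
    H_inv = np.linalg.inv(H)
    return float(np.trace(H_inv @ G @ H_inv))  # trace of sandwich covariance
```

In a well-specified model G ≈ H, so the estimate reduces to roughly tr(H^{-1}); under misspecification the sandwich form corrects for the mismatch between score covariance and Hessian.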
