Random Features Model with General Convex Regularization: A Fine Grained Analysis with Precise Asymptotic Learning Curves

6 April 2022
David Bosch
Ashkan Panahi
Ayça Özçelikkale
Devdatt Dubhashi
Abstract

We compute precise asymptotic expressions for the learning curves of least squares random feature (RF) models with either a separable strongly convex regularization or $\ell_1$ regularization. We propose a novel multi-level application of the convex Gaussian min-max theorem (CGMT) to overcome the traditional difficulty of finding computable expressions for random features models with correlated data. Our result takes the form of a computable 4-dimensional scalar optimization. In contrast to previous results, our approach does not require solving an often intractable proximal operator, which scales with the number of model parameters. Furthermore, we extend the universality results for the training and generalization errors of RF models to $\ell_1$ regularization. In particular, we demonstrate that under mild conditions, random feature models with elastic net or $\ell_1$ regularization are asymptotically equivalent to a surrogate Gaussian model with the same first and second moments. We numerically demonstrate the predictive capacity of our results, and show experimentally that the predicted test error is accurate even in the non-asymptotic regime.
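As a rough illustration of the setup and the universality claim above, here is a minimal numerical sketch (not the authors' code; the ReLU activation, dimensions, noise level, and regularization strength are assumed for illustration). It fits an $\ell_1$-regularized least squares model on nonlinear random features and on surrogate Gaussian features with matched first and second moments, then compares the resulting test errors.

```python
# Minimal sketch (assumed setup, not the authors' code): lasso on ReLU random
# features vs. a surrogate Gaussian model with matched first/second moments.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, n_test, d, p = 1000, 2000, 200, 400  # train/test samples, input dim, features
lam = 0.01                              # l1 strength (illustrative value)

beta_star = rng.standard_normal(d)              # teacher weights
F = rng.standard_normal((p, d)) / np.sqrt(d)    # fixed random feature matrix
relu = lambda z: np.maximum(z, 0.0)

def make_data(m):
    # linear teacher with additive Gaussian noise
    X = rng.standard_normal((m, d))
    y = X @ beta_star / np.sqrt(d) + 0.1 * rng.standard_normal(m)
    return X, y

# Gaussian-equivalent coefficients for relu(z) with z ~ N(0, 1):
# mu = E[relu(z)], c1 = Cov(z, relu(z)), c2^2 = Var(relu(z)) - c1^2
mu, c1 = 1 / np.sqrt(2 * np.pi), 0.5
c2 = np.sqrt(0.25 - 1 / (2 * np.pi))

def rf_features(X):
    return relu(X @ F.T)                # actual nonlinear random features

def gaussian_surrogate(X):
    # same first and second moments as relu(F x), per the universality claim
    return mu + c1 * (X @ F.T) + c2 * rng.standard_normal((X.shape[0], p))

X_tr, y_tr = make_data(n)
X_te, y_te = make_data(n_test)

for name, feat in [("RF (ReLU)", rf_features),
                   ("Gaussian surrogate", gaussian_surrogate)]:
    model = Lasso(alpha=lam).fit(feat(X_tr), y_tr)
    err = np.mean((model.predict(feat(X_te)) - y_te) ** 2)
    print(f"{name}: test MSE = {err:.4f}")
```

Under the paper's result, the two printed test errors should agree up to finite-size fluctuations. The closed-form learning curves themselves come from the 4-dimensional scalar optimization derived via the multi-level CGMT, which this sketch does not reproduce.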
