Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity

9 March 2020
Pritish Kamath, Omar Montasser, Nathan Srebro

Papers citing "Approximate is Good Enough: Probabilistic Variants of Dimensional and Margin Complexity"

8 papers shown

1. The Power of Random Features and the Limits of Distribution-Free Gradient Descent
   Ari Karchmer, Eran Malach
   15 May 2025

2. Pareto Frontiers in Neural Feature Learning: Data, Compute, Width, and Luck
   Benjamin L. Edelman, Surbhi Goel, Sham Kakade, Eran Malach, Cyril Zhang
   07 Sep 2023

3. Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit
   Boaz Barak, Benjamin L. Edelman, Surbhi Goel, Sham Kakade, Eran Malach, Cyril Zhang
   18 Jul 2022

4. Random Feature Amplification: Feature Learning and Generalization in Neural Networks
   Spencer Frei, Niladri S. Chatterji, Peter L. Bartlett [MLT]
   15 Feb 2022

5. Quantum machine learning beyond kernel methods
   Sofiene Jerbi, Lukas J. Fiderer, Hendrik Poulsen Nautrup, Jonas M. Kubler, H. Briegel, Vedran Dunjko
   25 Oct 2021

6. Reconstruction on Trees and Low-Degree Polynomials
   Frederic Koehler, Elchanan Mossel
   14 Sep 2021

7. The Connection Between Approximation, Depth Separation and Learnability in Neural Networks
   Eran Malach, Gilad Yehudai, Shai Shalev-Shwartz, Ohad Shamir
   31 Jan 2021

8. A case where a spindly two-layer linear network whips any neural network with a fully connected input layer
   Manfred K. Warmuth, W. Kotłowski, Ehsan Amid [MLT]
   16 Oct 2020