
On the Power of Differentiable Learning versus PAC and SQ Learning

9 August 2021
Emmanuel Abbe
Pritish Kamath
Eran Malach
Colin Sandon
Nathan Srebro
MLT
arXiv: 2108.04190

Papers citing "On the Power of Differentiable Learning versus PAC and SQ Learning"

2 / 2 papers shown
Hidden Progress in Deep Learning: SGD Learns Parities Near the Computational Limit
Boaz Barak
Benjamin L. Edelman
Surbhi Goel
Sham Kakade
Eran Malach
Cyril Zhang
18 Jul 2022
An initial alignment between neural network and target is needed for gradient descent to learn
Emmanuel Abbe
Elisabetta Cornacchia
Jan Hązła
Christopher Marquis
25 Feb 2022