How DNNs break the Curse of Dimensionality: Compositionality and Symmetry Learning

arXiv:2407.05664 · 8 July 2024
Arthur Jacot, Seok Hoan Choi, Yuxiao Wen
Community: AI4CE

Papers citing "How DNNs break the Curse of Dimensionality: Compositionality and Symmetry Learning"

6 / 6 papers shown
  • Shallow diffusion networks provably learn hidden low-dimensional structure
    Nicholas M. Boffi, Arthur Jacot, Stephen Tu, Ingvar M. Ziemann · DiffM · 29 · 1 · 0 · 15 Oct 2024
  • Which Frequencies do CNNs Need? Emergent Bottleneck Structure in Feature Learning
    Yuxiao Wen, Arthur Jacot · 45 · 6 · 0 · 12 Feb 2024
  • Implicit Bias of Large Depth Networks: a Notion of Rank for Nonlinear Functions
    Arthur Jacot · 34 · 24 · 0 · 29 Sep 2022
  • On the inability of Gaussian process regression to optimally learn compositional functions
    M. Giordano, Kolyan Ray, Johannes Schmidt-Hieber · 31 · 12 · 0 · 16 May 2022
  • The Role of Linear Layers in Nonlinear Interpolating Networks
    Greg Ongie, Rebecca Willett · 46 · 15 · 0 · 02 Feb 2022
  • Norm-Based Capacity Control in Neural Networks
    Behnam Neyshabur, Ryota Tomioka, Nathan Srebro · 111 · 577 · 0 · 27 Feb 2015