Tight conditions for when the NTK approximation is valid

22 May 2023
Enric Boix-Adserà, Etai Littwin
arXiv:2305.13141

Papers citing "Tight conditions for when the NTK approximation is valid"

5 of 5 papers shown

SGD learning on neural networks: leap complexity and saddle-to-saddle dynamics
Emmanuel Abbe, Enric Boix-Adserà, Theodor Misiakiewicz
FedML, MLT
21 Feb 2023

Learning Single-Index Models with Shallow Neural Networks
A. Bietti, Joan Bruna, Clayton Sanford, M. Song
27 Oct 2022

Neural Networks Efficiently Learn Low-Dimensional Representations with SGD
Alireza Mousavi-Hosseini, Sejun Park, M. Girotti, Ioannis Mitliagkas, Murat A. Erdogdu
MLT
29 Sep 2022

Learning with invariances in random features and kernel models
Song Mei, Theodor Misiakiewicz, Andrea Montanari
OOD
25 Feb 2021

On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks
Sifan Wang, Hanwen Wang, P. Perdikaris
18 Dec 2020