A Generalized Neural Tangent Kernel Analysis for Two-layer Neural Networks
arXiv:2002.04026, 10 February 2020
Zixiang Chen, Yuan Cao, Quanquan Gu, Tong Zhang
MLT

Papers citing "A Generalized Neural Tangent Kernel Analysis for Two-layer Neural Networks"

7 citing papers shown
An Exact Kernel Equivalence for Finite Classification Models
Brian Bell, Michaela Geyer, David Glickenstein, Amanda Fernandez, Juston Moore
01 Aug 2023
On the generalization of learning algorithms that do not converge
Neural Information Processing Systems (NeurIPS), 2022
N. Chandramoorthy, Andreas Loukas, Khashayar Gatmiry, Stefanie Jegelka
MLT
16 Aug 2022
Wasserstein Flow Meets Replicator Dynamics: A Mean-Field Analysis of Representation Learning in Actor-Critic
Neural Information Processing Systems (NeurIPS), 2021
Yufeng Zhang, Siyu Chen, Zhuoran Yang, Sai Li, Zhaoran Wang
27 Dec 2021
One-pass Stochastic Gradient Descent in Overparametrized Two-layer Neural Networks
International Conference on Artificial Intelligence and Statistics (AISTATS), 2021
Hanjing Zhu
MLT
01 May 2021
Can Temporal-Difference and Q-Learning Learn Representation? A Mean-Field Theory
Yufeng Zhang, Qi Cai, Zhuoran Yang, Yongxin Chen, Zhaoran Wang
OOD, MLT
08 Jun 2020
Predicting the outputs of finite deep neural networks trained with noisy gradients
Physical Review E (PRE), 2020
Gadi Naveh, Oded Ben-David, H. Sompolinsky, Zohar Ringel
02 Apr 2020
Landscape Connectivity and Dropout Stability of SGD Solutions for Over-parameterized Neural Networks
International Conference on Machine Learning (ICML), 2019
Aleksandr Shevchenko, Marco Mondelli
20 Dec 2019