ResearchTrend.AI
Over Parameterized Two-level Neural Networks Can Learn Near Optimal Feature Representations

25 October 2019
Cong Fang, Hanze Dong, Tong Zhang
arXiv: 1910.11508 (PDF, HTML)

Papers citing "Over Parameterized Two-level Neural Networks Can Learn Near Optimal Feature Representations"

2 of 2 citing papers shown:
1. Generalisation Guarantees for Continual Learning with Orthogonal Gradient Descent
   Mehdi Abbana Bennani, Thang Doan, Masashi Sugiyama
   CLL · 21 Jun 2020

2. Can Temporal-Difference and Q-Learning Learn Representation? A Mean-Field Theory
   Yufeng Zhang, Qi Cai, Zhuoran Yang, Yongxin Chen, Zhaoran Wang
   OOD, MLT · 08 Jun 2020