Generalization Error Bounds for Deep Neural Networks Trained by SGD
7 June 2022
Mingze Wang, Chao Ma

Papers citing "Generalization Error Bounds for Deep Neural Networks Trained by SGD"

10 / 10 papers shown
Learning Guarantee of Reward Modeling Using Deep Neural Networks
Yuanhang Luo, Yeheng Ge, Ruijian Han, Guohao Shen
10 May 2025

Statistically guided deep learning
Michael Kohler, A. Krzyżak
Communities: ODL, BDL
11 Apr 2025

Generalizability of Neural Networks Minimizing Empirical Risk Based on Expressive Ability
Lijia Yu, Yibo Miao, Yifan Zhu, Xiao-Shan Gao, Lijun Zhang
06 Mar 2025

A Near Complete Nonasymptotic Generalization Theory For Multilayer Neural Networks: Beyond the Bias-Variance Tradeoff
Hao Yu, Xiangyang Ji
Communities: AI4CE
03 Mar 2025

Generalization bounds for regression and classification on adaptive covering input domains
Wen-Liang Hwang
29 Jul 2024

Generalization Bound and New Algorithm for Clean-Label Backdoor Attack
Lijia Yu, Shuang Liu, Yibo Miao, Xiao-Shan Gao, Lijun Zhang
Communities: AAML
02 Jun 2024

Analysis of the rate of convergence of an over-parametrized convolutional neural network image classifier learned by gradient descent
Michael Kohler, A. Krzyżak, Benjamin Walter
13 May 2024

Improved Generalization Bounds for Communication Efficient Federated Learning
Peyman Gholami, H. Seferoglu
Communities: FedML, AI4CE
17 Apr 2024

Toward Understanding Generative Data Augmentation
Chenyu Zheng, Guoqiang Wu, Chongxuan Li
27 May 2023

More Communication Does Not Result in Smaller Generalization Error in Federated Learning
Abdellatif Zaidi, Romain Chor, Milad Sefidgaran
Communities: FedML, AI4CE
24 Apr 2023