Understanding the Generalization Ability of Deep Learning Algorithms: A Kernelized Rényi's Entropy Perspective

2 May 2023
Yuxin Dong, Tieliang Gong, H. Chen, Chen Li

Papers citing "Understanding the Generalization Ability of Deep Learning Algorithms: A Kernelized Rényi's Entropy Perspective"

4 of 4 citing papers shown
Entropy-based Guidance of Deep Neural Networks for Accelerated Convergence and Improved Performance
Mackenzie J. Meni, Ryan T. White, Michael L. Mayo, K. Pilkiewicz
[BDL] — 28 Aug 2023

On the Generalization of Models Trained with SGD: Information-Theoretic Bounds and Implications
Ziqiao Wang, Yongyi Mao
[FedML, MLT] — 07 Oct 2021

Information-theoretic generalization bounds for black-box learning algorithms
Hrayr Harutyunyan, Maxim Raginsky, Greg Ver Steeg, Aram Galstyan
04 Oct 2021

Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates
Jeffrey Negrea, Mahdi Haghifam, Gintare Karolina Dziugaite, Ashish Khisti, Daniel M. Roy
[FedML] — 06 Nov 2019