
On Rademacher Complexity-based Generalization Bounds for Deep Learning

Abstract

We show that the Rademacher complexity-based framework can establish non-vacuous generalization bounds for Convolutional Neural Networks (CNNs) in the context of classifying a small set of image classes. A key technical advance is a set of novel contraction lemmas for high-dimensional mappings between vector spaces, designed for general Lipschitz activation functions; these lemmas extend and refine the Talagrand contraction lemma to a broader range of settings. Our Rademacher complexity bound improves on the results of Golowich et al. for ReLU-based Deep Neural Networks (DNNs). Moreover, whereas previous works based on Rademacher complexity have focused primarily on ReLU DNNs, our results generalize to a wider class of activation functions.
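For context, a minimal sketch of the classical scalar statements the abstract refers to (standard definitions, not the paper's refined vector-valued lemmas): the empirical Rademacher complexity of a function class and the Talagrand contraction lemma it extends.

% Empirical Rademacher complexity of a class F on a sample S = (x_1, ..., x_n),
% where the sigma_i are i.i.d. Rademacher variables, uniform on {-1, +1}:
\[
  \widehat{\mathfrak{R}}_S(\mathcal{F})
  = \mathbb{E}_{\sigma}\!\left[\sup_{f \in \mathcal{F}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i)\right].
\]
% Talagrand's contraction lemma (scalar form): if each varphi_i : R -> R
% is L-Lipschitz, composing with varphi_i inflates the complexity by at most L:
\[
  \mathbb{E}_{\sigma}\!\left[\sup_{f \in \mathcal{F}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i \,\varphi_i\!\bigl(f(x_i)\bigr)\right]
  \le L \,\widehat{\mathfrak{R}}_S(\mathcal{F}).
\]

The paper's contribution, per the abstract, is extending this scalar contraction to high-dimensional mappings between vector spaces under general Lipschitz activations.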

@article{truong2025_2208.04284,
  title={On Rademacher Complexity-based Generalization Bounds for Deep Learning},
  author={Lan V. Truong},
  journal={arXiv preprint arXiv:2208.04284},
  year={2025}
}