Benefits of Additive Noise in Composing Classes with Bounded Capacity

Neural Information Processing Systems (NeurIPS), 2022
14 June 2022
A. F. Pour
H. Ashtiani
arXiv: 2206.07199

Papers citing "Benefits of Additive Noise in Composing Classes with Bounded Capacity"

3 papers shown
Beyond Universal Approximation Theorems: Algorithmic Uniform Approximation by Neural Networks Trained with Noisy Data
Anastasis Kratsios
Tin Sum Cheng
Daniel Roy
31 Aug 2025
On the Role of Noise in the Sample Complexity of Learning Recurrent Neural Networks: Exponential Gaps for Long Sequences
Neural Information Processing Systems (NeurIPS), 2023
A. F. Pour
H. Ashtiani
28 May 2023
Limitations of Information-Theoretic Generalization Bounds for Gradient Descent Methods in Stochastic Convex Optimization
International Conference on Algorithmic Learning Theory (ALT), 2022
Mahdi Haghifam
Borja Rodríguez Gálvez
Ragnar Thobaben
Mikael Skoglund
Daniel M. Roy
Gintare Karolina Dziugaite
27 Dec 2022