An Adaptive Method Stabilizing Activations for Enhanced Generalization

10 June 2025
Hyunseok Seung
Jaewoo Lee
Hyunsuk Ko
Abstract

We introduce AdaAct, a novel optimization algorithm that adjusts learning rates according to activation variance. Our method enhances the stability of neuron outputs by incorporating neuron-wise adaptivity during training, which in turn leads to better generalization -- a complementary approach to conventional activation regularization methods. Experimental results demonstrate AdaAct's competitive performance on standard image classification benchmarks: we evaluate it on CIFAR and ImageNet against other state-of-the-art methods. Importantly, AdaAct effectively bridges the gap between the convergence speed of Adam and the strong generalization of SGD, all while maintaining competitive execution times. Code is available at this https URL.
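The abstract describes the mechanism only at a high level, so the following is a minimal PyTorch sketch of the idea: observe each neuron's output variance with forward hooks and shrink that neuron's step size when its activations fluctuate. Every detail (the class name AdaActSketch, the EMA coefficient beta, the clamping rule) is an illustrative assumption, not the paper's actual update rule.

import torch
import torch.nn as nn

class AdaActSketch:
    """Hypothetical sketch of an activation-variance-aware optimizer.
    The real AdaAct update is defined in the paper; everything here
    is an assumption made for illustration."""

    def __init__(self, model, lr=1e-3, beta=0.99, eps=1e-8):
        self.model, self.lr, self.beta, self.eps = model, lr, beta, eps
        self.var = {}  # running per-neuron activation variance, keyed by layer name
        for name, module in model.named_modules():
            if isinstance(module, nn.Linear):
                module.register_forward_hook(self._make_hook(name))

    def _make_hook(self, name):
        def hook(module, inputs, output):
            # Per-neuron variance of this batch's activations.
            batch_var = output.detach().var(dim=0, unbiased=False)
            if name not in self.var:
                self.var[name] = batch_var
            else:  # exponential moving average across batches
                self.var[name] = self.beta * self.var[name] + (1 - self.beta) * batch_var
        return hook

    @torch.no_grad()
    def step(self):
        for name, module in self.model.named_modules():
            if isinstance(module, nn.Linear) and name in self.var:
                # Take smaller steps for neurons whose outputs fluctuate most,
                # i.e., stabilize activations as the abstract describes.
                scale = (1.0 / (self.var[name].sqrt() + self.eps)).clamp(max=1.0)
                module.weight -= self.lr * scale.unsqueeze(1) * module.weight.grad
                if module.bias is not None:
                    module.bias -= self.lr * scale * module.bias.grad

# Usage: one forward/backward pass populates the variance estimates, then step.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
opt = AdaActSketch(model)
x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))
loss = nn.functional.cross_entropy(model(x), y)
model.zero_grad()
loss.backward()
opt.step()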

@article{seung2025_2506.08353,
  title={An Adaptive Method Stabilizing Activations for Enhanced Generalization},
  author={Hyunseok Seung and Jaewoo Lee and Hyunsuk Ko},
  journal={arXiv preprint arXiv:2506.08353},
  year={2025}
}