SALR: Sharpness-aware Learning Rate Scheduler for Improved Generalization

10 November 2020 · arXiv:2011.05348
Xubo Yue, Maher Nouiehed, Raed Al Kontar
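The method itself is described in the arXiv text; purely for orientation, below is a minimal Python/NumPy sketch of the general idea of a sharpness-aware learning rate schedule: shrink the step size where a local sharpness proxy is large, grow it where the landscape is flat. This is an illustrative toy, not SALR's actual algorithm; the objective, the finite-difference sharpness proxy, and all constants are invented assumptions.

# Toy gradient descent whose learning rate adapts to a local sharpness proxy.
# NOT the SALR algorithm from the paper; all names and constants are
# illustrative assumptions.
import numpy as np

def loss(w):
    # Toy non-convex 1-D objective.
    return np.sin(3 * w) + 0.5 * w**2

def grad(w):
    # Analytic gradient of the toy objective.
    return 3 * np.cos(3 * w) + w

def sharpness_proxy(w, eps=1e-2):
    # Finite-difference ascent along the gradient direction: how fast the
    # loss rises within a small radius eps. Clamped away from zero so the
    # learning rate below stays finite.
    g = grad(w)
    direction = np.sign(g) if g != 0 else 1.0
    return max((loss(w + eps * direction) - loss(w)) / eps, 1e-8)

w, base_lr = 2.0, 0.5
for step in range(50):
    # Scale the step size inversely with the sharpness proxy, clipped
    # to a sane range: small steps in sharp regions, larger ones in flat.
    lr = np.clip(base_lr / sharpness_proxy(w), 1e-3, 1.0)
    w -= lr * grad(w)

print(f"final w = {w:.4f}, loss = {loss(w):.4f}")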

Papers citing "SALR: Sharpness-aware Learning Rate Scheduler for Improved Generalization"

3 papers:
Adaptively Sampling-Reusing-Mixing Decomposed Gradients to Speed Up Sharpness Aware Minimization
Jiaxin Deng, Junbiao Pang · 04 Oct 2025

Good regularity creates large learning rate implicit biases: edge of stability, balancing, and catapult
Yuqing Wang, Zhenghao Xu, Tuo Zhao, Molei Tao · 26 Oct 2023

Wide-minima Density Hypothesis and the Explore-Exploit Learning Rate Schedule (Journal of Machine Learning Research, 2020)
Nikhil Iyer, V. Thejas, Nipun Kwatra, Ramachandran Ramjee, Muthian Sivathanu · 09 Mar 2020