AdaLoss: A computationally-efficient and provably convergent adaptive gradient method

17 September 2021
Xiaoxia Wu
Yuege Xie
S. Du
Rachel A. Ward
Community: ODL

Papers citing "AdaLoss: A computationally-efficient and provably convergent adaptive gradient method" (2 papers):
1. "Adaptively Weighted Data Augmentation Consistency Regularization for Robust Optimization under Concept Shift" (OOD). Yijun Dong, Yuege Xie, Rachel A. Ward. 04 Oct 2022.
2. "Provable Regret Bounds for Deep Online Learning and Control". Xinyi Chen, Edgar Minasyan, Jason D. Lee, Elad Hazan. 15 Oct 2021.