ResearchTrend.AI
On Suppressing Range of Adaptive Stepsizes of Adam to Improve Generalisation Performance

Guoqiang Zhang
2 February 2023
arXiv: 2302.01029 (v3, latest)
Community: ODL

Papers citing "On Suppressing Range of Adaptive Stepsizes of Adam to Improve Generalisation Performance"

3 / 3 papers shown

1. AttentionX: Exploiting Consensus Discrepancy In Attention from A Distributed Optimization Perspective
   Guoqiang Zhang, Richard Heusdens
   06 Sep 2024

2. On Exact Bit-level Reversible Transformers Without Changing Architectures
   Guoqiang Zhang, J. P. Lewis, W. Kleijn
   Communities: MQ, AI4CE
   12 Jul 2024

3. On Accelerating Diffusion-Based Sampling Process via Improved Integration Approximation
   Guoqiang Zhang, Niwa Kenta, W. Kleijn
   Community: DiffM
   22 Apr 2023