Stochastic Two Points Method for Deep Model Zeroth-order Optimization

2 February 2024 · arXiv:2402.01621
Yijiang Pang, Jiayu Zhou
arXiv · PDF · HTML
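For context, the title refers to zeroth-order optimization, where model parameters are updated using only function evaluations rather than backpropagated gradients. Below is a minimal sketch of a generic two-point gradient estimator in that family; the estimator form, the smoothing radius mu, and the step size are illustrative assumptions, not the paper's exact S2P update rule.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-3, rng=None):
    """Generic two-point zeroth-order gradient estimator (a sketch,
    not the paper's S2P method). Samples a random direction u and
    approximates the gradient from two function evaluations:
        g ~= (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u
    """
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(x.shape)
    u /= np.linalg.norm(u)  # unit-norm direction, one common choice
    delta = (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu)
    return delta * u

# Usage: minimize a toy quadratic with plain zeroth-order descent.
f = lambda x: float(np.sum(x ** 2))
x = np.ones(10)
for _ in range(500):
    x -= 0.1 * two_point_grad_estimate(f, x)  # illustrative step size
print(f(x))  # should be close to 0
```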

Papers citing "Stochastic Two Points Method for Deep Model Zeroth-order Optimization"

5 / 5 papers shown

Just One Byte (per gradient): A Note on Low-Bandwidth Decentralized Language Model Finetuning Using Shared Randomness
E. Zelikman, Qian Huang, Percy Liang, Nick Haber, Noah D. Goodman
16 Jun 2023 · 14 citations

Convergence of Adam Under Relaxed Assumptions
Haochuan Li, Alexander Rakhlin, Ali Jadbabaie
27 Apr 2023 · 53 citations

Noise Is Not the Main Factor Behind the Gap Between SGD and Adam on Transformers, but Sign Descent Might Be
Frederik Kunstner, Jacques Chen, J. Lavington, Mark W. Schmidt
27 Apr 2023 · 66 citations

The Power of Scale for Parameter-Efficient Prompt Tuning
Brian Lester, Rami Al-Rfou, Noah Constant
Tags: VPVLM
18 Apr 2021 · 3,835 citations

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
Tags: PINN, 3DV
25 Aug 2016 · 36,237 citations