ResearchTrend.AI

Don't be so Monotone: Relaxing Stochastic Line Search in Over-Parameterized Models
arXiv:2306.12747

22 June 2023
Leonardo Galli
Holger Rauhut
Mark W. Schmidt

Papers citing "Don't be so Monotone: Relaxing Stochastic Line Search in Over-Parameterized Models"

4 / 4 papers shown
Convergence Conditions for Stochastic Line Search Based Optimization of Over-parametrized Models
  Matteo Lapucci, Davide Pucci
  06 Aug 2024

An adaptively inexact first-order method for bilevel optimization with application to hyperparameter learning
  Mohammad Salehi, Subhadip Mukherjee, Lindon Roberts, Matthias Joachim Ehrhardt
  19 Aug 2023

Noise Is Not the Main Factor Behind the Gap Between SGD and Adam on Transformers, but Sign Descent Might Be
  Frederik Kunstner, Jacques Chen, J. Lavington, Mark W. Schmidt
  27 Apr 2023

Densely Connected Convolutional Networks
  Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
  25 Aug 2016