ResearchTrend.AI


Theoretical Interpretation of Learned Step Size in Deep-Unfolded Gradient Descent
arXiv:2001.05142 (v2, latest) · 15 January 2020
Satoshi Takabe, Tadashi Wadayama
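The paper's topic, deep-unfolded gradient descent, unrolls a fixed number of gradient-descent iterations into a feed-forward network whose per-iteration step sizes are trainable parameters. A minimal sketch of the unrolled forward pass for a least-squares problem (the step-size schedule, problem sizes, and data here are illustrative placeholders, not the paper's setup):

```python
import numpy as np

# Deep-unfolded gradient descent (sketch): for min_x ||y - A x||^2, unroll
# T iterations of x_{t+1} = x_t - gamma_t * A^T (A x_t - y), where each
# layer t has its own step size gamma_t. In a trained network the gammas
# would be learned by backpropagation; here they are a hand-picked schedule.

rng = np.random.default_rng(0)
n, m, T = 30, 60, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)  # toy sensing matrix
x_true = rng.standard_normal(n)
y = A @ x_true                                # noiseless observations

gammas = np.full(T, 0.5)  # placeholder "learned" step sizes, one per layer

x = np.zeros(n)
for gamma in gammas:       # each loop iteration is one unrolled layer
    x = x - gamma * A.T @ (A @ x - y)

print(np.linalg.norm(x - x_true))  # error after T unrolled layers
```

Because each layer has its own step size, training can select a non-constant schedule that converges faster than any single fixed step size, which is the behavior the paper analyzes theoretically.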

Papers citing "Theoretical Interpretation of Learned Step Size in Deep-Unfolded Gradient Descent"

5 citing papers shown.

Learning Variational Models with Unrolling and Bilevel Optimization [SSL]
Christoph Brauer, Niklas Breustedt, T. Wolff, D. Lorenz
26 Sep 2022

One-Bit Compressive Sensing: Can We Go Deep and Blind?
IEEE Signal Processing Letters (SPL), 2022
Yiming Zeng, Shahin Khobahi, M. Soltanalian
13 Mar 2022

Symbolic Learning to Optimize: Towards Interpretability and Scalability
International Conference on Learning Representations (ICLR), 2022
Wenqing Zheng, Tianlong Chen, Ting-Kuei Hu, Zinan Lin
13 Mar 2022

A Design Space Study for LISTA and Beyond
International Conference on Learning Representations (ICLR), 2021
Tianjian Meng, Xiaohan Chen, Lezhi Li, Zinan Lin
08 Apr 2021

Learning to Optimize: A Primer and A Benchmark
Journal of Machine Learning Research (JMLR), 2021
Tianlong Chen, Xiaohan Chen, Wuyang Chen, Howard Heaton, Jialin Liu, Zinan Lin, W. Yin
23 Mar 2021