Cited By: Gradient is All You Need? (arXiv 2306.09778)
16 June 2023
Konstantin Riedl, T. Klock, Carina Geldhauser, M. Fornasier

Papers citing "Gradient is All You Need?" (6 of 6 papers shown)

FedCBO: Reaching Group Consensus in Clustered Federated Learning through Consensus-based Optimization
J. Carrillo, Nicolas García Trillos, Sixu Li, Yuhua Zhu (04 May 2023)

Leveraging Memory Effects and Gradient Information in Consensus-Based Optimization: On Global Convergence in Mean-Field Law
Konstantin Riedl (22 Nov 2022)

Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks
Torsten Hoefler, Dan Alistarh, Tal Ben-Nun, Nikoli Dryden, Alexandra Peste (31 Jan 2021)

Trainability and Accuracy of Neural Networks: An Interacting Particle System Approach
Grant M. Rotskoff, Eric Vanden-Eijnden (02 May 2018)

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition
Hamed Karimi, J. Nutini, Mark W. Schmidt (16 Aug 2016)

The Loss Surfaces of Multilayer Networks
A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun (30 Nov 2014)