A Study of Gradient Variance in Deep Learning
Fartash Faghri, D. Duvenaud, David J. Fleet, Jimmy Ba
arXiv:2007.04532 · 9 July 2020 · Tags: FedML, ODL
Papers citing "A Study of Gradient Variance in Deep Learning" (9 of 9 papers shown)
| Title | Authors | Tags | Metrics | Date |
|---|---|---|---|---|
| Multiple Importance Sampling for Stochastic Gradient Estimation | Corentin Salaün, Xingchang Huang, Iliyan Georgiev, Niloy J. Mitra, Gurprit Singh | | 24 / 1 / 0 | 22 Jul 2024 |
| Critical Learning Periods: Leveraging Early Training Dynamics for Efficient Data Pruning | E. Chimoto, Jay Gala, Orevaoghene Ahia, Julia Kreutzer, Bruce A. Bassett, Sara Hooker | VLM | 39 / 4 / 0 | 29 May 2024 |
| Metadata Archaeology: Unearthing Data Subsets by Leveraging Training Dynamics | Shoaib Ahmed Siddiqui, Nitarshan Rajkumar, Tegan Maharaj, David M. Krueger, Sara Hooker | | 37 / 27 / 0 | 20 Sep 2022 |
| On the Interpretability of Regularisation for Neural Networks Through Model Gradient Similarity | Vincent Szolnoky, Viktor Andersson, Balázs Kulcsár, Rebecka Jörnsten | | 37 / 5 / 0 | 25 May 2022 |
| MSTGD: A Memory Stochastic sTratified Gradient Descent Method with an Exponential Convergence Rate | Aixiang Chen, Chen, Jinting Zhang, Zanbo Zhang, Zhihong Li | | 35 / 0 / 0 | 21 Feb 2022 |
| On the Generalization of Models Trained with SGD: Information-Theoretic Bounds and Implications | Ziqiao Wang, Yongyi Mao | FedML, MLT | 32 / 22 / 0 | 07 Oct 2021 |
| Fishr: Invariant Gradient Variances for Out-of-Distribution Generalization | Alexandre Ramé, Corentin Dancette, Matthieu Cord | OOD | 38 / 204 / 0 | 07 Sep 2021 |
| On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima | N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang | ODL | 281 / 2,888 / 0 | 15 Sep 2016 |
| Efficient Per-Example Gradient Computations | Ian Goodfellow | | 186 / 74 / 0 | 07 Oct 2015 |