Truth or Backpropaganda? An Empirical Investigation of Deep Learning Theory

1 October 2019
Micah Goldblum, Jonas Geiping, Avi Schwarzschild, Michael Moeller, Tom Goldstein

Papers citing "Truth or Backpropaganda? An Empirical Investigation of Deep Learning Theory" (9 papers shown)
Just How Flexible are Neural Networks in Practice?
Ravid Shwartz-Ziv, Micah Goldblum, Arpit Bansal, C. Bayan Bruss, Yann LeCun, Andrew Gordon Wilson
17 Jun 2024
Koopman-based generalization bound: New aspect for full-rank weights
Yuka Hashimoto, Sho Sonoda, Isao Ishikawa, Atsushi Nitanda, Taiji Suzuki
12 Feb 2023
Robbing the Fed: Directly Obtaining Private Data in Federated Learning with Modified Models (FedML)
Liam H. Fowl, Jonas Geiping, W. Czaja, Micah Goldblum, Tom Goldstein
25 Oct 2021
Stochastic Training is Not Necessary for Generalization
Jonas Geiping, Micah Goldblum, Phillip E. Pope, Michael Moeller, Tom Goldstein
29 Sep 2021
A linearized framework and a new benchmark for model selection for fine-tuning (ALM)
Aditya Deshpande, Alessandro Achille, Avinash Ravichandran, Hao Li, L. Zancato, Charless C. Fowlkes, Rahul Bhotika, Stefano Soatto, Pietro Perona
29 Jan 2021
Ramifications of Approximate Posterior Inference for Bayesian Deep Learning in Adversarial and Out-of-Distribution Settings (UQCV)
John Mitros, A. Pakrashi, Brian Mac Namee
03 Sep 2020
Predicting Training Time Without Training
L. Zancato, Alessandro Achille, Avinash Ravichandran, Rahul Bhotika, Stefano Soatto
28 Aug 2020
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima (ODL)
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016
The Loss Surfaces of Multilayer Networks (ODL)
A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun
30 Nov 2014