Why does deep and cheap learning work so well?
arXiv:1608.08225 (latest version: v4)

29 August 2016
Henry W. Lin
Max Tegmark
David Rolnick

Papers citing "Why does deep and cheap learning work so well?"

50 / 236 papers shown
Deep Neural Networks as the Semi-classical Limit of Topological Quantum Neural Networks: The problem of generalisation
A. Marcianò
De-Wei Chen
Filippo Fabrocini
C. Fields
M. Lulli
Emanuele Zappala
GNN
114
6
0
25 Oct 2022
Precision Machine Learning
Eric J. Michaud
Ziming Liu
Max Tegmark
166
40
0
24 Oct 2022
When Expressivity Meets Trainability: Fewer than $n$ Neurons Can Work
Neural Information Processing Systems (NeurIPS), 2022
Jiawei Zhang
Yushun Zhang
Mingyi Hong
Tian Ding
Jianfeng Yao
323
10
0
21 Oct 2022
Why neural networks find simple solutions: the many regularizers of geometric complexity
Neural Information Processing Systems (NeurIPS), 2022
Benoit Dherin
Michael Munn
M. Rosca
David Barrett
340
42
0
27 Sep 2022
Three Learning Stages and Accuracy-Efficiency Tradeoff of Restricted Boltzmann Machines
Nature Communications (Nat Commun), 2022
Lennart Dabelow
Masahito Ueda
222
11
0
02 Sep 2022
Gaussian Process Surrogate Models for Neural Networks
Conference on Uncertainty in Artificial Intelligence (UAI), 2022
Michael Y. Li
Erin Grant
Thomas Griffiths
BDL, SyDa
251
9
0
11 Aug 2022
Image sensing with multilayer, nonlinear optical neural networks
Nature Photonics (Nat. Photonics), 2022
Tianyu Wang
Mandar M. Sohoni
Logan G. Wright
Martin M. Stein
Shifan Ma
Tatsuhiro Onodera
Maxwell G. Anderson
Peter L. McMahon
159
224
0
27 Jul 2022
Wavelet Conditional Renormalization Group
Physical Review X (PRX), 2022
Tanguy Marchand
M. Ozawa
Giulio Biroli
S. Mallat
118
21
0
11 Jul 2022
Advanced Transient Diagnostic with Ensemble Digital Twin Modeling
Edward Chen
Linyu Lin
Truc-Nam Dinh
62
4
0
23 May 2022
Towards understanding deep learning with the natural clustering prior
Simon Carbonnelle
176
0
0
15 Mar 2022
Categorical Representation Learning and RG flow operators for algorithmic classifiers
A. Sheshmani
Yi-Zhuang You
Wenbo Fu
A. Azizi
AI4CE
66
4
0
15 Mar 2022
Identifying equivalent Calabi--Yau topologies: A discrete challenge from math and physics for machine learning
Vishnu Jejjala
W. Taylor
Andrew P. Turner
257
8
0
15 Feb 2022
Decomposing neural networks as mappings of correlation functions
Physical Review Research (Phys. Rev. Res.), 2022
Kirsten Fischer
Alexandre René
Christian Keup
Moritz Layer
David Dahmen
M. Helias
FAtt
229
19
0
10 Feb 2022
Complexity from Adaptive-Symmetries Breaking: Global Minima in the Statistical Mechanics of Deep Neural Networks
Shaun Li
AI4CE
226
1
0
03 Jan 2022
Explicitly antisymmetrized neural network layers for variational Monte Carlo simulation
Jeffmin Lin
Gil Goldshlager
Lin Lin
197
29
0
07 Dec 2021
Error Bounds for a Matrix-Vector Product Approximation with Deep ReLU Neural Networks
T. Getu
202
2
0
25 Nov 2021
Feature extraction of machine learning and phase transition point of Ising model
S. Funai
130
3
0
22 Nov 2021
Universality of Winning Tickets: A Renormalization Group Perspective
William T. Redman
Tianlong Chen
Zinan Lin
Akshunna S. Dogra
UQCV
277
8
0
07 Oct 2021
Impossibility Results in AI: A Survey
ACM Computing Surveys (CSUR), 2021
Mario Brčič
Roman V. Yampolskiy
338
29
0
01 Sep 2021
Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
SciPost Physics (SciPost Phys.), 2021
J. Erdmenger
Kevin T. Grosvenor
R. Jefferson
164
23
0
14 Jul 2021
Entropy Regularized Reinforcement Learning Using Large Deviation Theory
Physical Review Research (Phys. Rev. Res.), 2021
A. Arriojas
Jacob Adamczyk
Stas Tiomkin
R. Kulkarni
AI4CE
79
6
0
07 Jun 2021
Reverse Engineering the Neural Tangent Kernel
International Conference on Machine Learning (ICML), 2021
James B. Simon
Sajant Anand
M. DeWeese
362
13
0
06 Jun 2021
Machine-Learning Non-Conservative Dynamics for New-Physics Detection
Physical Review E (PRE), 2021
Ziming Liu
Bohan Wang
Qi Meng
Wei Chen
M. Tegmark
Tie-Yan Liu
PINN, AI4CE
385
19
0
31 May 2021
MAGI-X: Manifold-Constrained Gaussian Process Inference for Unknown System Dynamics
Chaofan Huang
Simin Ma
Shihao Yang
165
0
0
27 May 2021
Apply Artificial Neural Network to Solving Manpower Scheduling Problem
Tianyu Liu
Lingyu Zhang
84
2
0
07 May 2021
Deep physical neural networks enabled by a backpropagation algorithm for arbitrary physical systems
Logan G. Wright
Tatsuhiro Onodera
Martin M. Stein
Tianyu Wang
Darren T. Schachter
Zoey Hu
Peter L. McMahon
PINN, AI4CE
249
625
0
27 Apr 2021
On the approximation of functions by tanh neural networks
Neural Networks (NN), 2021
Tim De Ryck
S. Lanthaler
Siddhartha Mishra
264
172
0
18 Apr 2021
The Autodidactic Universe
S. Alexander
W. Cunningham
J. Lanier
L. Smolin
S. Stanojevic
M. Toomey
D. Wecker
AI4CE
232
21
0
29 Mar 2021
Tensor networks and efficient descriptions of classical data
Sirui Lu
Márton Kanász-Nagy
I. Kukuljan
J. I. Cirac
150
34
0
11 Mar 2021
Why flatness does and does not correlate with generalization for deep neural networks
Shuo Zhang
Isaac Reid
Guillermo Valle Pérez
A. Louis
285
10
0
10 Mar 2021
Deep ReLU Networks Preserve Expected Length
International Conference on Learning Representations (ICLR), 2021
Boris Hanin
Ryan Jeong
David Rolnick
147
15
0
21 Feb 2021
Information contraction in noisy binary neural networks and its implications
Chuteng Zhou
Quntao Zhuang
Matthew Mattina
P. Whatmough
138
3
0
28 Jan 2021
Advances in Electron Microscopy with Deep Learning
Jeffrey M. Ede
661
3
0
04 Jan 2021
The Representation Power of Neural Networks: Breaking the Curse of Dimensionality
Moise Blanchard
M. A. Bennouna
157
7
0
10 Dec 2020
Why Unsupervised Deep Networks Generalize
Anita de Mello Koch
E. Koch
R. Koch
OOD
152
8
0
07 Dec 2020
MixMix: All You Need for Data-Free Compression Are Feature and Data Mixing
IEEE International Conference on Computer Vision (ICCV), 2020
Yuhang Li
Feng Zhu
Yazhe Niu
Mingzhu Shen
Xin Dong
F. Yu
Shaoqing Lu
Shi Gu
MQ
210
51
0
19 Nov 2020
A Study of Policy Gradient on a Class of Exactly Solvable Models
Gavin McCracken
Colin Daniels
Rosie Zhao
Anna M. Brandenberger
Prakash Panangaden
Doina Precup
143
0
0
03 Nov 2020
Physics-Based Deep Learning for Fiber-Optic Communication Systems
IEEE Journal on Selected Areas in Communications (JSAC), 2020
Christian Hager
H. Pfister
186
79
0
27 Oct 2020
A Probabilistic Representation of Deep Learning for Improving The Information Theoretic Interpretability
Xinjie Lan
Kenneth Barner
FAtt
127
2
0
27 Oct 2020
Provable Memorization via Deep Neural Networks using Sub-linear Parameters
Annual Conference on Computational Learning Theory (COLT), 2020
Sejun Park
Jaeho Lee
Chulhee Yun
Jinwoo Shin
FedML, MDE
165
43
0
26 Oct 2020
RG-Flow: A hierarchical and explainable flow model based on renormalization group and sparse prior
Hong-Ye Hu
Dian Wu
Yi-Zhuang You
Bruno A. Olshausen
Yubei Chen
BDL, DRL
418
18
0
30 Sep 2020
Implicit Gradient Regularization
International Conference on Learning Representations (ICLR), 2020
David Barrett
Benoit Dherin
364
170
0
23 Sep 2020
Review: Deep Learning in Electron Microscopy
Jeffrey M. Ede
912
90
0
17 Sep 2020
Supervised Learning with Projected Entangled Pair States
Physical Review B (PRB), 2020
Song Cheng
Lei Wang
Pan Zhang
123
59
0
12 Sep 2020
On Representing (Anti)Symmetric Functions
Marcus Hutter
112
25
0
30 Jul 2020
Measurement error models: from nonparametric methods to deep neural networks
Statistical Science (Statist. Sci.), 2020
Zhirui Hu
Z. Ke
Jun S. Liu
93
4
0
15 Jul 2020
Human $\neq$ AGI
Roman V. Yampolskiy
AI4CE
171
1
0
11 Jul 2020
Modeling Generalization in Machine Learning: A Methodological and Computational Study
Pietro Barbiero
Giovanni Squillero
Alberto Tonda
63
47
0
28 Jun 2020
Thermodynamic Machine Learning through Maximum Work Production
A. B. Boyd
James P. Crutchfield
M. Gu
AI4CE
421
19
0
27 Jun 2020
Is SGD a Bayesian sampler? Well, almost
Chris Mingard
Guillermo Valle Pérez
Joar Skalse
A. Louis
BDL
270
62
0
26 Jun 2020