Why does deep and cheap learning work so well?
Henry W. Lin, Max Tegmark, David Rolnick
29 August 2016 · arXiv:1608.08225

Papers citing "Why does deep and cheap learning work so well?"

50 / 236 papers shown

Maximum Multiscale Entropy and Neural Network Regularization
Amir-Reza Asadi, Emmanuel Abbe
25 Jun 2020

Predicting First Passage Percolation Shapes Using Neural Networks
Sebastian Rosengren
24 Jun 2020 · AI4CE

Hierarchically Compositional Tasks and Deep Convolutional Networks. Journal of Vision (J Vis), 2020
Arturo Deza, Q. Liao, Andrzej Banburski, T. Poggio
24 Jun 2020 · BDL, OOD

Restricted Boltzmann Machine Flows and The Critical Temperature of Ising models
R. Veiga, R. Vicente
17 Jun 2020 · AI4CE

PAC-Bayesian Generalization Bounds for MultiLayer Perceptrons
Xinjie Lan, Xin Guo, Kenneth Barner
16 Jun 2020

Minimum Width for Universal Approximation
Sejun Park, Chulhee Yun, Jaeho Lee, Jinwoo Shin
16 Jun 2020

Tangent Space Sensitivity and Distribution of Linear Regions in ReLU Networks
Balint Daroczy
11 Jun 2020 · AAML

Machine Learning for Condensed Matter Physics
Edwin Bedolla, L. C. Padierna, R. Castañeda-Priego
28 May 2020 · AI4CE

PDE constraints on smooth hierarchical functions computed by neural networks
Khashayar Filom, Konrad Paul Kording, Roozbeh Farhoodi
18 May 2020

How hard is to distinguish graphs with graph neural networks?
Andreas Loukas
13 May 2020 · GNN

Boosting on the shoulders of giants in quantum device calibration
A. Wozniakowski, Jayne Thompson, M. Gu, F. Binder
13 May 2020

Off-the-shelf deep learning is not enough: parsimony, Bayes and causality
Rama K Vasudevan, M. Ziatdinov, L. Vlček, Sergei V. Kalinin
04 May 2020 · BDL, CML, AI4CE

Bias-corrected estimator for intrinsic dimension and differential entropy--a visual multiscale approach. Journal of Communication and Information Systems (JCIS), 2020
J. Filho, J. Canuto, Luiz Miranda
30 Apr 2020

Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond
Fanghui Liu, Xiaolin Huang, Yudong Chen, Johan A. K. Suykens
23 Apr 2020 · BDL

Towards a theory of machine learning
V. Vanchurin
15 Apr 2020

Adaptive Partial Scanning Transmission Electron Microscopy with Reinforcement Learning
Jeffrey M. Ede
06 Apr 2020

Depth Selection for Deep ReLU Nets in Feature Extraction and Generalization. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2020
Zhi Han, Siquan Yu, Shao-Bo Lin, Ding-Xuan Zhou
01 Apr 2020 · OOD

Comments on Sejnowski's "The unreasonable effectiveness of deep learning in artificial intelligence" [arXiv:2002.04806]
L. Smith
20 Mar 2020

Warwick Electron Microscopy Datasets
Jeffrey M. Ede
02 Mar 2020

A closer look at the approximation capabilities of neural networks. International Conference on Learning Representations (ICLR), 2020
Kai Fong Ernest Chong
16 Feb 2020

Neural network wave functions and the sign problem. Physical Review Research (PRResearch), 2020
A. Szabó, C. Castelnovo
11 Feb 2020

Short sighted deep learning. Physical Review E (PRE), 2020
R. Koch, Anita de Mello Koch, Nicholas Kastanos, Ling Cheng
07 Feb 2020

Differentiable programming and its applications to dynamical systems
A. Hernández, José M. Amigó
17 Dec 2019

Realization of spatial sparseness by deep ReLU nets with massive data. IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2019
C. Chui, Shao-Bo Lin, Bo Zhang, Ding-Xuan Zhou
16 Dec 2019

Parameters Estimation for the Cosmic Microwave Background with Bayesian Neural Networks
Héctor J. Hortúa, Riccardo Volpi, D. Marinelli, Luigi Malagò
19 Nov 2019 · BDL

Generalization in multitask deep neural classifiers: a statistical physics approach. Neural Information Processing Systems (NeurIPS), 2019
Tyler Lee, A. Ndirango
30 Oct 2019 · AI4CE

CTNN: Corticothalamic-inspired neural network
Leendert A. Remmelzwaal, A. Mishra, George F. R. Ellis
28 Oct 2019 · OOD

Face representation by deep learning: a linear encoding in a parameter space?
Qiulei Dong, Zhanyi Hu
22 Oct 2019 · CVBM

A Flexible Framework for Anomaly Detection via Dimensionality Reduction
A. V. Sadr, Bruce A. Bassett, M. Kunz
09 Sep 2019

Sparse hierarchical representation learning on molecular graphs
M. Bal, Hagen Triendl, Mariana Assmann, M. Craig, Lawrence Phillips, J. Frost, Usman Bashir, Noor Shaker, V. Stojevic
06 Aug 2019 · GNN

A direct approach for function approximation on data defined manifolds
H. Mhaskar
01 Aug 2019

Convolutional Neural Networks on Randomized Data
Cristian Ivan
25 Jul 2019

A Group-Theoretic Framework for Data Augmentation
Shuxiao Chen, Edgar Dobriban, Jane Lee
25 Jul 2019 · FedML

Post-synaptic potential regularization has potential. International Conference on Artificial Neural Networks (ICANN), 2019
Enzo Tartaglione, Daniele Perlo, Marco Grangetto
19 Jul 2019 · BDL, AAML

Privileged Features Distillation at Taobao Recommendations. Knowledge Discovery and Data Mining (KDD), 2019
Chen Xu, Quan Li, Junfeng Ge, Jinyang Gao, Xiaoyong Yang, Changhua Pei, Fei Sun, Jian Wu, Hanxiao Sun, Wenwu Ou
11 Jul 2019

Parameterized quantum circuits as machine learning models. Quantum Science and Technology (QST), 2019
Marcello Benedetti, Erika Lloyd, Stefan H. Sack, Mattia Fiorentini
18 Jun 2019

Interpretations of Deep Learning by Forests and Haar Wavelets
Changcun Huang
16 Jun 2019 · FAtt

Is Deep Learning a Renormalization Group Flow?
E. Koch, R. Koch, Ling Cheng
12 Jun 2019 · OOD, AI4CE

When and Why Metaheuristics Researchers Can Ignore "No Free Lunch" Theorems
James McDermott
07 Jun 2019 · FedML

Deep ReLU Networks Have Surprisingly Few Activation Patterns. Neural Information Processing Systems (NeurIPS), 2019
Boris Hanin, David Rolnick
03 Jun 2019

Partial Scanning Transmission Electron Microscopy with Deep Learning
Jeffrey M. Ede, R. Beanland
31 May 2019 · MedIm

Provably scale-covariant continuous hierarchical networks based on scale-normalized differential expressions coupled in cascade. Journal of Mathematical Imaging and Vision (JMIV), 2019
T. Lindeberg
29 May 2019

AI Feynman: a Physics-Inspired Method for Symbolic Regression. Science Advances (Sci Adv), 2019
S. Udrescu, Max Tegmark
27 May 2019

On the descriptive power of Neural-Networks as constrained Tensor Networks with exponentially large bond dimension. SciPost Physics Core (SPC), 2019
M. Collura, L. Dell’Anna, Timo Felser, S. Montangero
27 May 2019

A Selective Overview of Deep Learning
Jianqing Fan, Cong Ma, Yiqiao Zhong
10 Apr 2019 · BDL, VLM

On functions computed on trees
Roozbeh Farhoodi, Khashayar Filom, I. Jones, Konrad Paul Kording
04 Apr 2019 · PINN

Deep Neural Networks for Rotation-Invariance Approximation and Learning
C. Chui, Shao-Bo Lin, Ding-Xuan Zhou
03 Apr 2019

Generative Tensor Network Classification Model for Supervised Machine Learning
Zheng-Zhi Sun, C. Peng, Ding Liu, Shi-Ju Ran, G. Su
26 Mar 2019

ToyArchitecture: Unsupervised Learning of Interpretable Models of the World
Jaroslav Vítků, Petr Dluhos, Joseph Davidson, Matej Nikl, Simon Andersson, ..., Martin Stránský, M. Hyben, Martin Poliak, Jan Feyereisl, Marek Rosa
20 Mar 2019 · AI4CE, SSL

How to Make Swarms Open-Ended? Evolving Collective Intelligence Through a Constricted Exploration of Adjacent Possibles. Artificial Life (ALIFE), 2019
Olaf Witkowski, T. Ikegami
19 Mar 2019