
Stuck in a What? Adventures in Weight Space
Zachary Chase Lipton
arXiv:1602.07320, 23 February 2016

Papers citing "Stuck in a What? Adventures in Weight Space"

Showing 10 of 10 citing papers.
Perspective: A Phase Diagram for Deep Learning unifying Jamming, Feature Learning and Lazy Training
Mario Geiger, Leonardo Petrini, Matthieu Wyart
DRL
30 Dec 2020

Classifying the classifier: dissecting the weight space of neural networks
European Conference on Artificial Intelligence (ECAI), 2020
Gabriel Eilertsen, Daniel Jonsson, Timo Ropinski, Jonas Unger, Anders Ynnerman
13 Feb 2020

Luck Matters: Understanding Training Dynamics of Deep ReLU Networks
Yuandong Tian, Tina Jiang, Qucheng Gong, Ari S. Morcos
31 May 2019

Scaling description of generalization with number of parameters in deep learning
Mario Geiger, Arthur Jacot, S. Spigler, Franck Gabriel, Levent Sagun, Stéphane d'Ascoli, Giulio Biroli, Clément Hongler, Matthieu Wyart
06 Jan 2019

A jamming transition from under- to over-parametrization affects loss landscape and generalization
S. Spigler, Mario Geiger, Stéphane d'Ascoli, Levent Sagun, Giulio Biroli, Matthieu Wyart
22 Oct 2018

The jamming transition as a paradigm to understand the loss landscape of deep neural networks
Physical Review E (PRE), 2018
Mario Geiger, S. Spigler, Stéphane d'Ascoli, Levent Sagun, Carlo Albert, Giulio Biroli, Matthieu Wyart
25 Sep 2018

PCA of high dimensional random walks with comparison to neural network training
J. Antognini, Jascha Narain Sohl-Dickstein
OOD
22 Jun 2018

Comparing Dynamics: Deep Neural Networks versus Glassy Systems
Carlo Albert, Levent Sagun, Mario Geiger, S. Spigler, Gerard Ben Arous, C. Cammarota, Yann LeCun, Matthieu Wyart, Giulio Biroli
AI4CE
19 Mar 2018

Visualizing the Loss Landscape of Neural Nets
Neural Information Processing Systems (NeurIPS), 2017
Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein
28 Dec 2017

Distributed Bayesian Learning with Stochastic Natural-gradient Expectation Propagation and the Posterior Server
Leonard Hasenclever, Stefan Webb, Thibaut Lienart, Sebastian J. Vollmer, Balaji Lakshminarayanan, Charles Blundell, Yee Whye Teh
BDL
31 Dec 2015