ResearchTrend.AI

Emergent properties of the local geometry of neural loss landscapes


14 October 2019
Stanislav Fort, Surya Ganguli

Papers citing "Emergent properties of the local geometry of neural loss landscapes"

14 / 14 papers shown
Unraveling the Hessian: A Key to Smooth Convergence in Loss Function Landscapes
Nikita Kiselev, Andrey Grabovoy
18 Sep 2024
Data Shapley in One Training Run
Jiachen T. Wang, Prateek Mittal, Dawn Song, Ruoxi Jia
16 Jun 2024
Agnostic Sharpness-Aware Minimization
Van-Anh Nguyen, Quyen Tran, Tuan Truong, Thanh-Toan Do, Dinh Q. Phung, Trung Le
11 Jun 2024
Directions of Curvature as an Explanation for Loss of Plasticity
Alex Lewandowski, Haruto Tanaka, Dale Schuurmans, Marlos C. Machado
30 Nov 2023
On the Power-Law Hessian Spectrums in Deep Learning
Zeke Xie, Qian-Yuan Tang, Yunfeng Cai, Mingming Sun, P. Li
31 Jan 2022
Does the Data Induce Capacity Control in Deep Learning?
Rubing Yang, J. Mao, Pratik Chaudhari
27 Oct 2021
On the Impact of Stable Ranks in Deep Nets
B. Georgiev, L. Franken, Mayukh Mukherjee, Georgios Arvanitidis
05 Oct 2021
Consensus Control for Decentralized Deep Learning
Lingjing Kong, Tao R. Lin, Anastasia Koloskova, Martin Jaggi, Sebastian U. Stich
09 Feb 2021
Chaos and Complexity from Quantum Neural Network: A Study with Diffusion Metric in Machine Learning
S. Choudhury, Ankan Dutta, Debisree Ray
16 Nov 2020
Improving Neural Network Training in Low Dimensional Random Bases
Frithjof Gressmann, Zach Eaton-Rosen, Carlo Luschi
09 Nov 2020
Linear Mode Connectivity in Multitask and Continual Learning
Seyed Iman Mirzadeh, Mehrdad Farajtabar, Dilan Görür, Razvan Pascanu, H. Ghasemzadeh
09 Oct 2020
The Break-Even Point on Optimization Trajectories of Deep Neural Networks
Stanislaw Jastrzebski, Maciej Szymczak, Stanislav Fort, Devansh Arpit, Jacek Tabor, Kyunghyun Cho, Krzysztof J. Geras
21 Feb 2020
Stiffness: A New Perspective on Generalization in Neural Networks
Stanislav Fort, Pawel Krzysztof Nowak, Stanislaw Jastrzebski, S. Narayanan
28 Jan 2019
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016