ResearchTrend.AI
arXiv:2402.06184
The boundary of neural network trainability is fractal
9 February 2024 · Jascha Narain Sohl-Dickstein
ArXiv · PDF · HTML

Papers citing "The boundary of neural network trainability is fractal"

5 / 5 papers shown
| Title | Authors | Topics | Date |
|---|---|---|---|
| FOCUS: First Order Concentrated Updating Scheme | Yizhou Liu, Ziming Liu, Jeff Gore | ODL | 21 Jan 2025 |
| Mapping the Edge of Chaos: Fractal-Like Boundaries in the Trainability of Decoder-Only Transformer Models | Bahman Torkamandi | AI4CE | 08 Jan 2025 |
| A spring-block theory of feature learning in deep neural networks | Chengzhi Shi, Liming Pan, Ivan Dokmanić | AI4CE | 28 Jul 2024 |
| Robustness of Algorithms for Causal Structure Learning to Hyperparameter Choice | Damian Machlanski, Spyridon Samothrakis, Paul Clarke | CML | 27 Oct 2023 |
| From Stability to Chaos: Analyzing Gradient Descent Dynamics in Quadratic Regression | Xuxing Chen, Krishnakumar Balasubramanian, Promit Ghosal, Bhavya Agrawalla | | 02 Oct 2023 |