
When Representations Align: Universality in Representation Learning Dynamics

14 February 2024
Loek van Rossem
Andrew M. Saxe

Papers citing "When Representations Align: Universality in Representation Learning Dynamics"

5 of 5 papers shown

Formation of Representations in Neural Networks
  Liu Ziyin, Isaac Chuang, Tomer Galanti, T. Poggio
  03 Oct 2024

Clustering and Alignment: Understanding the Training Dynamics in Modular Addition
  Tiberiu Musat
  18 Aug 2024

Survival of the Fittest Representation: A Case Study with Modular Addition
  Xiaoman Delores Ding, Zifan Carl Guo, Eric J. Michaud, Ziming Liu, Max Tegmark
  27 May 2024

Scaling Laws for Neural Language Models
  Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei
  23 Jan 2020

Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks
  Lechao Xiao, Yasaman Bahri, Jascha Narain Sohl-Dickstein, S. Schoenholz, Jeffrey Pennington
  14 Jun 2018