ResearchTrend.AI

Towards Understanding Neural Collapse: The Effects of Batch Normalization and Weight Decay

9 September 2023
Leyan Pan, Xinyuan Cao

Papers citing "Towards Understanding Neural Collapse: The Effects of Batch Normalization and Weight Decay"

5 / 5 papers shown
The Fair Language Model Paradox
Andrea Pinto, Tomer Galanti, Randall Balestriero
15 Oct 2024
Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse
Arthur Jacot, Peter Súkeník, Zihan Wang, Marco Mondelli
07 Oct 2024
Neural Collapse versus Low-rank Bias: Is Deep Neural Collapse Really Optimal?
Peter Súkeník, Marco Mondelli, Christoph H. Lampert
23 May 2024
Nearest Class-Center Simplification through Intermediate Layers
Ido Ben-Shaul, S. Dekel
21 Jan 2022
An Unconstrained Layer-Peeled Perspective on Neural Collapse
Wenlong Ji, Yiping Lu, Yiliang Zhang, Zhun Deng, Weijie J. Su
06 Oct 2021