ResearchTrend.AI
Towards Understanding Neural Collapse: The Effects of Batch Normalization and Weight Decay
arXiv:2309.04644, v2 (latest)
9 September 2023
Leyan Pan, Xinyuan Cao

Papers citing "Towards Understanding Neural Collapse: The Effects of Batch Normalization and Weight Decay"

5 citing papers:
Cautious Weight Decay
Lizhang Chen, Jonathan Li, Kaizhao Liang, Baiyu Su, Cong Xie, Nuo Wang Pierse, Chen Liang, Ni Lao, Qiang Liu
14 Oct 2025
TRUST: Test-time Resource Utilization for Superior Trustworthiness
Haripriya Harikumar, Santu Rana
06 Jun 2025
The Fair Language Model Paradox
Andrea Pinto, Tomer Galanti, Randall Balestriero
15 Oct 2024
Wide Neural Networks Trained with Weight Decay Provably Exhibit Neural Collapse
International Conference on Learning Representations (ICLR), 2024
Arthur Jacot, Peter Súkeník, Zihan Wang, Marco Mondelli
07 Oct 2024
Neural Collapse versus Low-rank Bias: Is Deep Neural Collapse Really Optimal?
Peter Súkeník, Marco Mondelli, Christoph H. Lampert
23 May 2024