
Analyzing Neural Scaling Laws in Two-Layer Networks with Power-Law Data Spectra

International Conference on Learning Representations (ICLR), 2024
11 October 2024
Roman Worschech, B. Rosenow
arXiv:2410.09005 (abs) · PDF · HTML · GitHub

Papers citing "Analyzing Neural Scaling Laws in Two-Layer Networks with Power-Law Data Spectra"

3 / 3 papers shown
Fast Escape, Slow Convergence: Learning Dynamics of Phase Retrieval under Power-Law Data
Guillaume Braun, Bruno Loureiro, Ha Quang Minh, Masaaki Imaizumi
164 · 1 · 0
24 Nov 2025
How Feature Learning Can Improve Neural Scaling Laws
International Conference on Learning Representations (ICLR), 2024
Blake Bordelon, Alexander B. Atanasov, Cengiz Pehlevan
559 · 43 · 0
26 Sep 2024
Resolving Discrepancies in Compute-Optimal Scaling of Language Models
Tomer Porian, Mitchell Wortsman, J. Jitsev, Ludwig Schmidt, Y. Carmon
620 · 62 · 0
27 Jun 2024
Page 1 of 1