
Batch Normalization Preconditioning for Neural Network Training
Susanna Lange, Kyle E. Helfrich, Qiang Ye
2 August 2021 · arXiv:2108.01110 (v2, latest) · abs / PDF / HTML

Papers citing "Batch Normalization Preconditioning for Neural Network Training" (4 / 4 papers shown)

1. Designing Preconditioners for SGD: Local Conditioning, Noise Floors, and Basin Stability
   Mitchell Scott, Tianshi Xu, Z. Tang, Alexandra Pichette-Emmons, Qiang Ye, Y. Saad, Yuanzhe Xi
   Community: AI4CE · 24 Nov 2025

2. Preconditioning for Accelerated Gradient Descent Optimization and Regularization
   Qiang Ye
   Community: AI4CE · 30 Sep 2024

3. fKAN: Fractional Kolmogorov-Arnold Networks with trainable Jacobi basis functions
   Alireza Afzal Aghaei
   11 Jun 2024

4. Understanding the Generalization Benefit of Normalization Layers: Sharpness Reduction
   Neural Information Processing Systems (NeurIPS), 2022
   Kaifeng Lyu, Zhiyuan Li, Sanjeev Arora
   Community: FAtt · 14 Jun 2022