The Polar Express: Optimal Matrix Sign Methods and Their Application to the Muon Algorithm

22 May 2025
Noah Amsel
David Persson
Christopher Musco
Robert Gower
Main: 17 pages · 7 figures · Bibliography: 5 pages · Appendix: 12 pages
Abstract

Computing the polar decomposition and the related matrix sign function has been a well-studied problem in numerical analysis for decades. More recently, it has emerged as an important subroutine in deep learning, particularly within the Muon optimization framework. However, the requirements in this setting differ significantly from those of traditional numerical analysis. In deep learning, methods must be highly efficient and GPU-compatible, but high accuracy is often unnecessary. As a result, classical algorithms like Newton-Schulz (which suffers from slow initial convergence) and methods based on rational functions (which rely on QR decompositions or matrix inverses) are poorly suited to this context. In this work, we introduce Polar Express, a GPU-friendly algorithm for computing the polar decomposition. Like classical polynomial methods such as Newton-Schulz, our approach uses only matrix-matrix multiplications, making it GPU-compatible. Motivated by earlier work of Chen & Chow and Nakatsukasa & Freund, Polar Express adapts the polynomial update rule at each iteration by solving a minimax optimization problem, and we prove that it enjoys a strong worst-case optimality guarantee. This property ensures both rapid early convergence and fast asymptotic convergence. We also address finite-precision issues, making the method stable in bfloat16 in practice. We apply Polar Express within the Muon optimization framework and show consistent improvements in validation loss on large-scale models such as GPT-2, outperforming recent alternatives across a range of learning rates.
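
As a point of reference for the matmul-only structure the abstract describes, below is a minimal sketch of the classical cubic Newton-Schulz iteration for the polar factor, written in PyTorch. This is the baseline the abstract compares against, not the authors' Polar Express method: Polar Express keeps the same multiply-only structure but replaces the fixed coefficients (1.5, -0.5) with per-iteration polynomial coefficients obtained from a minimax optimization. The function name newton_schulz_polar, the Frobenius-norm scaling, and the step count are illustrative assumptions, not details from the paper.

import torch

def newton_schulz_polar(G: torch.Tensor, steps: int = 10) -> torch.Tensor:
    """Approximate the polar factor of G via classical (cubic) Newton-Schulz.

    Uses only matrix-matrix products, so it maps well to GPUs. Polar Express
    (this paper) keeps this structure but chooses optimal per-iteration
    coefficients instead of the fixed (1.5, -0.5) used here.
    """
    # Scale so all singular values lie in (0, 1]; the cubic map
    # s -> 1.5*s - 0.5*s^3 then drives them toward 1.
    X = G / (G.norm() + 1e-7)
    transposed = False
    if X.shape[0] > X.shape[1]:
        # Work with the wide orientation so the Gram matrix X @ X.T is small.
        X = X.T
        transposed = True
    for _ in range(steps):
        A = X @ X.T
        X = 1.5 * X - 0.5 * (A @ X)
    return X.T if transposed else X

# Example: orthogonalize a gradient-shaped matrix, as Muon does for weight updates.
G = torch.randn(1024, 512)
U = newton_schulz_polar(G, steps=8)

Each step of this sketch costs two matrix multiplications, but convergence is slow when singular values are close to zero; that slow early phase is exactly the weakness the paper's per-iteration minimax-optimal polynomials are designed to remove.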

@article{amsel2025_2505.16932,
  title={The Polar Express: Optimal Matrix Sign Methods and Their Application to the Muon Algorithm},
  author={Noah Amsel and David Persson and Christopher Musco and Robert M. Gower},
  journal={arXiv preprint arXiv:2505.16932},
  year={2025}
}