Flash normalization: fast normalization for LLMs

Abstract
RMSNorm is used by many LLMs such as Llama, Mistral, and OpenELM. This paper details FlashNorm, an exact but faster implementation of RMSNorm followed by linear layers. FlashNorm also speeds up Layer Normalization and its recently proposed replacement Dynamic Tanh (DyT), arXiv:2503.10622. See this https URL for code and more transformer tricks.
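The abstract's claim that FlashNorm is "exact but faster" rests on a weight-folding observation: because RMSNorm's per-channel scale commutes with the matrix multiply that follows it, the scale weights can be merged into the linear layer offline, leaving only one scalar division per token at inference. The sketch below illustrates this folding with numpy; the function names, shapes, and epsilon placement are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def rmsnorm_then_linear(x, g, W, eps=1e-6):
    # Baseline: RMSNorm with per-channel weights g, then a linear layer W.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x / rms * g) @ W

def folded_norm_linear(x, W_merged, eps=1e-6):
    # Weight-folding sketch: g is pre-merged into W, so only a single
    # scalar division by the RMS remains per token at inference time.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x @ W_merged) / rms

rng = np.random.default_rng(0)
d, k = 8, 4
x = rng.standard_normal((3, d))   # 3 tokens, hidden size d (toy sizes)
g = rng.standard_normal(d)        # RMSNorm weights
W = rng.standard_normal((d, k))   # following linear layer

W_merged = g[:, None] * W         # fold g into the rows of W (done once, offline)

# Scaling commutes with the matmul, so the results match exactly
# (up to floating-point rounding).
assert np.allclose(rmsnorm_then_linear(x, g, W),
                   folded_norm_linear(x, W_merged))
```

The same commutation argument applies wherever a diagonal scale directly precedes a linear projection, which is why the paper can extend the trick beyond RMSNorm.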
@article{graef2025_2407.09577,
  title={Flash normalization: fast normalization for LLMs},
  author={Nils Graef and Matthew Clapp and Andrew Wasielewski},
  journal={arXiv preprint arXiv:2407.09577},
  year={2025}
}