FlashNorm: fast normalization for LLMs

Comments: 8 pages (main), 14 figures, 2 pages (bibliography)
Abstract

This paper presents FlashNorm, an exact but faster implementation of RMSNorm followed by linear layers. RMSNorm is used by many LLMs such as Llama, Mistral, and OpenELM. FlashNorm also speeds up Layer Normalization and its recently proposed replacement, Dynamic Tanh (DyT) (arXiv:2503.10622). In addition, FlashNorm reduces the number of parameter tensors by merging the normalization weights into the weights of the next linear layer. See this https URL for code and more transformer tricks.
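The weight-merging idea can be sketched as follows: because RMSNorm's per-channel weights `g` apply a diagonal scaling right before the linear layer `W`, the product `g ⊙ W` can be precomputed offline, leaving only the 1/RMS scaling at inference time. This is a minimal NumPy sketch of that folding under the assumption of a standard RMSNorm with elementwise weights; variable names are illustrative and not from the paper's code.

```python
import numpy as np

EPS = 1e-6  # small constant for numerical stability (assumed)

def rmsnorm(x, g, eps=EPS):
    # Standard RMSNorm: scale x by 1/RMS(x), then apply per-channel weights g
    rms = np.sqrt(np.mean(x * x) + eps)
    return (x / rms) * g

rng = np.random.default_rng(0)
d, k = 8, 4                       # hidden dim, output dim (toy sizes)
x = rng.standard_normal(d)        # activation vector
g = rng.standard_normal(d)        # RMSNorm weights
W = rng.standard_normal((d, k))   # next linear layer

# Baseline: normalize with weights g, then apply the linear layer
y_ref = rmsnorm(x, g) @ W

# Merged: fold g into W once (offline); inference keeps only the 1/RMS scale
W_merged = g[:, None] * W         # W'[i, j] = g[i] * W[i, j]
rms = np.sqrt(np.mean(x * x) + EPS)
y_fast = (x / rms) @ W_merged

# The merge is exact (up to floating-point rounding)
print(np.allclose(y_ref, y_fast))
```

The same diagonal-folding argument applies per channel to LayerNorm's scale weights, which is consistent with the abstract's claim that the trick removes the normalization weight tensors as separate parameters.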
