Enhanced QKNorm normalization for neural transformers with the Lp norm
Ezequiel Lopez-Rubio
Javier Montes-Perez
Esteban Jose Palomo
Main: 5 pages · 3 figures · Bibliography: 1 page
Abstract
The normalization of query and key vectors is an essential part of the Transformer architecture: it keeps learning stable regardless of the scale of these vectors. Several normalization schemes have been proposed for this purpose. In this preliminary work, a generalization of the QKNorm normalization scheme is proposed. The approach is based on the Lp norm, allowing non-Euclidean norms to be employed. Experimental results demonstrate the suitability of the method on a simple problem.
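As a rough illustration of the idea, the following sketch applies Lp normalization to query and key vectors before computing attention scores, in the spirit of QKNorm (which uses the L2 norm together with a learned scale). This is an assumed, minimal NumPy rendition, not the authors' implementation; the scalar scale `g`, the epsilon, and the single-head layout are all simplifying assumptions.

```python
import numpy as np

def lp_normalize(x, p=2.0, eps=1e-6):
    """Normalize each row of x by its Lp norm (p >= 1)."""
    norm = np.sum(np.abs(x) ** p, axis=-1, keepdims=True) ** (1.0 / p)
    return x / (norm + eps)

def qknorm_attention(q, k, v, g=10.0, p=2.0):
    """QKNorm-style attention with an Lp norm.

    Queries and keys are Lp-normalized before the dot product, so the
    score scale is controlled by the factor g (learned in the original
    QKNorm; a fixed scalar here) rather than by the vector magnitudes.
    """
    q_hat = lp_normalize(q, p)
    k_hat = lp_normalize(k, p)
    scores = g * (q_hat @ k_hat.T)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Setting `p=2.0` recovers standard QKNorm behavior; other values of `p` (e.g. 1 or 3) give the non-Euclidean variants the abstract refers to.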
