
Clustering in Causal Attention Masking

Neural Information Processing Systems (NeurIPS), 2024
Main: 8 pages · Appendix: 11 pages · Bibliography: 3 pages · 6 figures · 1 table
Abstract

This work presents a modification of the self-attention dynamics proposed by Geshkovski et al. (arXiv:2312.10794) to better reflect the practically relevant, causally masked attention used in transformer architectures for generative AI. This modification translates into an interacting particle system that cannot be interpreted as a mean-field gradient flow. Despite this loss of structure, we significantly strengthen the results of Geshkovski et al. (arXiv:2312.10794) in this context: while previous rigorous results covered only the case where all three matrices (key, query, and value) are scaled identities, we prove asymptotic convergence to a single cluster for arbitrary key-query matrices and a value matrix equal to the identity. Additionally, we establish a connection to the classical Rényi parking problem from combinatorial geometry, taking initial theoretical steps towards demonstrating the existence of meta-stable states.
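As a rough illustration of the kind of dynamics studied here (a sketch of our own, not the paper's exact formulation), the causally masked interacting particle system can be simulated with tokens on the unit sphere: each token attends only to itself and its predecessors, and, in the case of identity key-query and value matrices assumed below, all tokens drift toward a single cluster. The function name, Euler discretization, and inverse-temperature parameter `beta` are illustrative choices.

```python
import numpy as np

def causal_attention_step(X, beta=1.0, dt=0.1):
    """One Euler step of causally masked self-attention dynamics on the sphere.

    Illustrative sketch only: identity key-query and value matrices are
    assumed, and `beta` plays the role of an inverse temperature.
    """
    scores = beta * (X @ X.T)                          # dot-product attention scores
    mask = np.tril(np.ones(scores.shape, dtype=bool))  # causal: token i sees j <= i
    scores = np.where(mask, scores, -np.inf)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                  # row-wise causal softmax
    drift = w @ X                                      # value matrix = identity
    drift -= np.sum(drift * X, axis=1, keepdims=True) * X  # project onto tangent space
    X = X + dt * drift
    return X / np.linalg.norm(X, axis=1, keepdims=True)    # renormalize to the sphere

# Random tokens on the unit sphere; the first token never moves (it attends
# only to itself), and the remaining tokens drift toward it.
rng = np.random.default_rng(0)
X0 = rng.normal(size=(8, 3))
X0 /= np.linalg.norm(X0, axis=1, keepdims=True)
X = X0.copy()
for _ in range(2000):
    X = causal_attention_step(X)
```

Note the asymmetry that causal masking introduces: unlike the unmasked dynamics, the system is no longer a mean-field gradient flow, and the leading token acts as a fixed anchor that the rest of the sequence collapses onto.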
