On the Spatial Structure of Mixture-of-Experts in Transformers

Abstract

A common assumption is that Mixture-of-Experts (MoE) routers primarily leverage semantic features for expert selection. However, our study challenges this notion by demonstrating that positional information about tokens also plays a crucial role in routing decisions. Through extensive empirical analysis, we provide evidence supporting this hypothesis, develop a phenomenological explanation of the observed behavior, and discuss practical implications for MoE-based architectures.
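To make the claim concrete: one simple way to quantify "positional structure in routing" is to estimate the mutual information between a token's position and the expert it is routed to. If routing were driven purely by semantics, this quantity should be close to zero; a large value indicates position-dependent expert assignment. The sketch below is an illustrative probe, not the authors' method; the function name position_expert_mutual_information and the input expert_ids (top-1 routing decisions collected from one MoE layer) are hypothetical.

import numpy as np

def position_expert_mutual_information(expert_ids: np.ndarray) -> float:
    """Estimate I(position; expert) in bits from routing decisions.

    expert_ids: (num_sequences, seq_len) integer array of top-1 expert
    indices, one per token, collected from a single MoE layer.
    Hypothetical probe; not the method from the paper.
    """
    n_seq, seq_len = expert_ids.shape
    n_experts = int(expert_ids.max()) + 1

    # Joint histogram over (position, expert) pairs.
    joint = np.zeros((seq_len, n_experts))
    for pos in range(seq_len):
        joint[pos] = np.bincount(expert_ids[:, pos], minlength=n_experts)
    joint /= joint.sum()

    # Marginals over position and expert.
    p_pos = joint.sum(axis=1, keepdims=True)
    p_exp = joint.sum(axis=0, keepdims=True)

    # I(P; E) = sum_{p,e} joint * log2(joint / (p_pos * p_exp)),
    # with 0 * log(0) treated as 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(joint > 0, joint / (p_pos * p_exp), 1.0)
    return float(np.sum(joint * np.log2(ratio)))

On routing traces where experts specialize by position, this estimate approaches a noticeable fraction of log2(n_experts); for purely content-based routing it should be near zero, up to finite-sample bias.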

@article{bershatsky2025_2504.04444,
  title={On the Spatial Structure of Mixture-of-Experts in Transformers},
  author={Daniel Bershatsky and Ivan Oseledets},
  journal={arXiv preprint arXiv:2504.04444},
  year={2025}
}