
Harnessing the Universal Geometry of Embeddings

Main: 9 pages · Appendix: 4 pages · Bibliography: 4 pages · 7 figures · 10 tables
Abstract

We introduce the first method for translating text embeddings from one vector space to another without any paired data, encoders, or predefined sets of matches. Our unsupervised approach translates any embedding to and from a universal latent representation (i.e., a universal semantic structure conjectured by the Platonic Representation Hypothesis). Our translations achieve high cosine similarity across model pairs with different architectures, parameter counts, and training datasets.
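The abstract describes translating embeddings between vector spaces through a shared latent representation and evaluating with cosine similarity. The toy sketch below illustrates that idea under strong simplifying assumptions: two "models" are simulated as random linear projections of a common latent space, and the translator is fit with paired data by least squares. This is only an illustration of the geometry involved, not the paper's method, which is unsupervised and uses no paired data; all names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_latent, d_a, d_b, n = 8, 16, 12, 200

# Toy stand-in for the shared semantic structure conjectured by the
# Platonic Representation Hypothesis: both "models" embed the same
# latent vectors through different linear maps.
Z = rng.normal(size=(n, d_latent))    # shared latent representations
A = rng.normal(size=(d_latent, d_a))  # "model A" projection
B = rng.normal(size=(d_latent, d_b))  # "model B" projection
emb_a, emb_b = Z @ A, Z @ B

# Fit a translator A-space -> latent -> B-space via least squares.
# NOTE: this uses paired (emb_a, Z) data for simplicity; the paper's
# approach is unsupervised and requires no such pairs.
to_latent, *_ = np.linalg.lstsq(emb_a, Z, rcond=None)
translated = (emb_a @ to_latent) @ B

# Cosine similarity between translated and true B-space embeddings.
cos = np.sum(translated * emb_b, axis=1) / (
    np.linalg.norm(translated, axis=1) * np.linalg.norm(emb_b, axis=1)
)
print(f"mean cosine similarity: {cos.mean():.3f}")
```

Because the toy projections are linear and full-rank, the least-squares translator recovers the latent exactly and the mean cosine similarity is near 1; real embedding spaces are nonlinear, which is what makes the unsupervised translation problem hard.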

@article{jha2025_2505.12540,
  title={Harnessing the Universal Geometry of Embeddings},
  author={Rishi Jha and Collin Zhang and Vitaly Shmatikov and John X. Morris},
  journal={arXiv preprint arXiv:2505.12540},
  year={2025}
}