BERT-JEPA: Reorganizing CLS Embeddings for Language-Invariant Semantics

Taj Gillin
Adam Lalani
Kenneth Zhang
Marcel Mateos Salles
Main: 4 pages · 11 figures · Bibliography: 3 pages · 10 tables · Appendix: 9 pages
Abstract

Joint Embedding Predictive Architectures (JEPAs) are a self-supervised training technique that has recently shown promise across domains. We introduce BERT-JEPA (BEPA), a training paradigm that adds a JEPA objective to BERT-style models, counteracting a collapsed [CLS] embedding space and reorganizing it into a language-agnostic one. This new structure leads to increased performance across multilingual benchmarks.
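The core JEPA idea the abstract describes can be sketched in a few lines: a predictor maps the context encoder's [CLS] embedding toward the embedding a slowly-updated target encoder produces for a paired input, and the target encoder tracks the online encoder via an exponential moving average. The sketch below is a dependency-free toy illustration under those standard JEPA assumptions; all names (`linear`, `ema_update`, the toy weights) are illustrative and not from the BEPA paper.

```python
# Toy, dependency-free sketch of a JEPA-style objective on [CLS] embeddings.
# Assumption: the setup follows the common JEPA recipe (online encoder +
# predictor trained to match an EMA target encoder), not BEPA's exact details.

def mse(a, b):
    """Mean squared error between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def linear(weights, vec):
    """Apply a weight matrix (list of rows) to a vector."""
    return [sum(w * x for w, x in zip(row, vec)) for row in weights]

def ema_update(target_w, online_w, tau=0.99):
    """Target-encoder weights track the online encoder as an
    exponential moving average, as is standard in JEPA training."""
    return [[tau * t + (1 - tau) * o for t, o in zip(t_row, o_row)]
            for t_row, o_row in zip(target_w, online_w)]

# Toy 2-dim "[CLS]" embeddings from a hypothetical parallel sentence pair.
context_cls = [1.0, 0.0]   # e.g. the English sentence's [CLS]
target_cls  = [0.0, 1.0]   # e.g. its translation's [CLS]

online_w = [[1.0, 0.0], [0.0, 1.0]]   # online (context) encoder head
target_w = [[1.0, 0.0], [0.0, 1.0]]   # EMA target encoder head
pred_w   = [[0.0, 1.0], [1.0, 0.0]]   # predictor mapping context -> target

# JEPA loss: the predictor's output on the context embedding should match
# the target encoder's (stop-gradient) embedding of the paired input.
prediction = linear(pred_w, linear(online_w, context_cls))
target = linear(target_w, target_cls)
loss = mse(prediction, target)   # 0.0 here: the predictor swaps coordinates

# After an optimizer step on the online encoder and predictor,
# the target encoder is refreshed by EMA rather than by gradients:
target_w = ema_update(target_w, online_w)
```

Because the loss is applied in embedding space rather than over tokens, gradients push paired sentences in different languages toward a shared representation, which is the mechanism behind the language-agnostic [CLS] space described above.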
