RigoBERTa: A State-of-the-Art Language Model For Spanish
Alejandro Vaca Serrano
Guillem García Subies
Helena Montoro Zamorano
Nuria Aldama García
Doaa Samy
David Betancur Sánchez
Antonio Moreno-Sandoval
Marta Guerrero Nieto
Á. Jiménez

Abstract
This paper presents RigoBERTa, a state-of-the-art language model for Spanish. RigoBERTa is trained on a well-curated corpus composed of several subcorpora with key features. It follows the DeBERTa architecture, which has several advantages over other architectures of similar size, such as BERT or RoBERTa. RigoBERTa's performance is assessed on 13 NLU tasks in comparison with other available Spanish language models, namely MarIA, BERTIN and BETO. RigoBERTa outperformed those three models in 10 of the 13 tasks, achieving new state-of-the-art results.