TimeLMs: Diachronic Language Models from Twitter

Annual Meeting of the Association for Computational Linguistics (ACL), 2022
Abstract

Despite its importance, the time variable has been largely neglected in the NLP and language model literature. In this paper, we present TimeLMs, a set of language models specialized in diachronic Twitter data. We show that a continual learning strategy enhances Twitter-based language models' capacity to deal with future and out-of-distribution tweets, while keeping them competitive on standardized and more monolithic benchmarks. We also perform a number of qualitative analyses showing how they cope with trends and peaks in activity involving specific named entities, as well as with concept drift.
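
For illustration, the released TimeLMs checkpoints can be queried through the standard Hugging Face transformers fill-mask interface. The snippet below is a minimal sketch: the checkpoint identifier (cardiffnlp/twitter-roberta-base-2021-124m) and the example tweet are assumptions for illustration rather than details stated in this abstract; consult the project repository for the actual list of released models.

# Minimal sketch: querying a TimeLMs checkpoint with the Hugging Face
# fill-mask pipeline. The checkpoint name below is an assumption based on
# the naming scheme of the released models. TimeLMs checkpoints are
# RoBERTa-base models, so masked positions use RoBERTa's <mask> token.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="cardiffnlp/twitter-roberta-base-2021-124m",  # assumed identifier
)

# Print the top predictions for the masked token, with their scores.
for prediction in fill_mask("So glad I'm <mask> vaccinated."):
    print(f"{prediction['token_str']:>12}  {prediction['score']:.3f}")

Running the same masked tweet through checkpoints trained up to different quarters is one way to surface the temporal effects the abstract describes: the top predictions shift as the training data moves forward in time.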
