Elastic Weight Consolidation for Full-Parameter Continual Pre-Training of Gemma2

Abstract
This technical report describes an experiment on autoregressive pre-training of the Gemma2 2-billion-parameter large language model (LLM) on 10% of the Lithuanian-language component of CulturaX, viewed from the perspective of continual learning. We apply elastic weight consolidation (EWC) to the full set of the model's parameters and evaluate the model on language understanding benchmarks, consisting of the Arc, Belebele, Gsm8K, Hellaswag, MMLU, TruthfulQA, and Winogrande sets (in both English and Lithuanian versions), as well as on perplexity benchmarks. We empirically demonstrate that EWC regularisation not only mitigates catastrophic forgetting effects but is also potentially beneficial for learning the new task with LLMs.
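As background, and as an assumption about the standard form of the regulariser (Kirkpatrick et al., 2017) rather than a statement of the authors' exact implementation, EWC augments the loss of the new task with a diagonal-Fisher quadratic penalty over the full parameter vector:

\[
\mathcal{L}(\theta) \;=\; \mathcal{L}_{\text{new}}(\theta) \;+\; \frac{\lambda}{2}\sum_i F_i\,\bigl(\theta_i - \theta_i^{*}\bigr)^2,
\]

where $\theta^{*}$ denotes the parameters of the original pre-trained model, $F_i$ is the $i$-th diagonal element of the Fisher information estimated on the original task, and $\lambda$ controls the strength of consolidation. Applied to the full parameter set, the penalty discourages drift along directions the original model relied on heavily.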
@article{šliogeris2025_2505.05946,
  title   = {Elastic Weight Consolidation for Full-Parameter Continual Pre-Training of Gemma2},
  author  = {Vytenis Šliogeris and Povilas Daniušis and Artūras Nakvosas},
  journal = {arXiv preprint arXiv:2505.05946},
  year    = {2025}
}