
Elastic Weight Consolidation for Full-Parameter Continual Pre-Training of Gemma2

2025-05-09

Vytenis Šliogeris, Povilas Daniušis, Artūras Nakvosas

Abstract

This technical report describes an experiment on autoregressive pre-training of the Gemma2 2-billion-parameter large language model (LLM) on 10% of the Lithuanian language component of CulturaX, viewed from the perspective of continual learning. We apply elastic weight consolidation (EWC) to the full set of the model's parameters and evaluate on language understanding benchmarks (ARC, Belebele, GSM8K, HellaSwag, MMLU, TruthfulQA, and Winogrande, in both English and Lithuanian versions) as well as perplexity benchmarks. We empirically demonstrate that EWC regularisation not only mitigates catastrophic forgetting but may also be beneficial for learning the new task.
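For readers unfamiliar with EWC, the sketch below shows how the regulariser could be added to a causal language-modelling loss in PyTorch. This is an illustrative assumption, not the authors' implementation: the helper names (`estimate_diagonal_fisher`, `ewc_penalty`), the regularisation strength `lam`, and the training setup are hypothetical, and only the general form of the penalty (a diagonal-Fisher-weighted quadratic distance from the pre-trained weights) follows the standard EWC formulation.

```python
# Illustrative EWC sketch (PyTorch); not the authors' code.
# Assumes `model` is a causal LM (e.g. a Gemma2-2B checkpoint loaded via
# transformers) and each batch contains `labels` so that `out.loss` exists.
import torch


def estimate_diagonal_fisher(model, dataloader, device="cuda"):
    """Diagonal Fisher estimate: average squared gradient of the LM loss,
    computed on data from the original (pre-training) task."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()
              if p.requires_grad}
    model.eval()
    for batch in dataloader:
        model.zero_grad()
        out = model(**{k: v.to(device) for k, v in batch.items()})
        out.loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(dataloader), 1) for n, f in fisher.items()}


def ewc_penalty(model, fisher, theta_star):
    """Quadratic EWC penalty: sum_i F_i * (theta_i - theta*_i)^2."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for n, p in model.named_parameters():
        if n in fisher:
            penalty = penalty + (fisher[n] * (p - theta_star[n]) ** 2).sum()
    return penalty


# Snapshot of the pre-trained weights (the anchor theta*):
# theta_star = {n: p.detach().clone() for n, p in model.named_parameters()}
#
# During continual pre-training on the new-language data, with an assumed
# regularisation strength `lam` chosen by validation:
# loss = lm_loss + 0.5 * lam * ewc_penalty(model, fisher, theta_star)
```

Since the report applies EWC to the full set of parameters, the penalty in such a sketch would run over every trainable weight of the model rather than a selected subset.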
