Mitigating the Diminishing Effect of Elastic Weight Consolidation
2022-10-01 · COLING 2022
Canasai Kruengkrai, Junichi Yamagishi
Abstract
Elastic weight consolidation (EWC, Kirkpatrick et al. 2017) is a promising approach to addressing catastrophic forgetting in sequential training. We find that the effect of EWC can diminish when fine-tuning large-scale pre-trained language models on different datasets. We present two simple objective functions to mitigate this problem by rescaling the components of EWC. Experiments on natural language inference and fact-checking tasks indicate that our methods require much smaller values for the trade-off parameters to achieve results comparable to EWC.
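For readers unfamiliar with EWC, the standard objective it refers to adds a quadratic penalty that anchors each parameter to its value after the previous task, weighted by the Fisher information and a trade-off parameter. The sketch below illustrates that general form only; all names (`ewc_penalty`, `fisher`, `theta_old`, `lam`) are illustrative and not taken from this paper, which modifies how the EWC components are rescaled.

```python
def ewc_penalty(theta, theta_old, fisher, lam):
    """Quadratic EWC penalty (Kirkpatrick et al., 2017):
    (lam / 2) * sum_i F_i * (theta_i - theta_old_i)^2,
    where F_i is the diagonal Fisher information estimated
    on the previous task and lam is the trade-off parameter."""
    return 0.5 * lam * sum(
        f * (t - t0) ** 2
        for t, t0, f in zip(theta, theta_old, fisher)
    )

def total_loss(task_loss, theta, theta_old, fisher, lam):
    # Sequential-training objective: new-task loss plus the EWC term.
    return task_loss + ewc_penalty(theta, theta_old, fisher, lam)
```

The paper's contribution concerns the choice and scale of `lam` and the rescaling of the penalty's components; with large pre-trained language models, the authors report that comparable results are reachable with much smaller trade-off values under their rescaled objectives.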