Efficient transfer learning for NLP with ELECTRA
2021-04-06
François Mercier
- github.com/cccwam/rc2020_electra (official PyTorch implementation)
Abstract
Clark et al. [2020] claim that the ELECTRA pre-training approach is highly compute-efficient, delivering strong NLP performance relative to its computation budget. This reproducibility study focuses on that claim, summarized by the following question: can we use ELECTRA to achieve close-to-SOTA NLP performance in low-resource settings, in terms of compute cost?
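For context, the efficiency claim rests on ELECTRA's replaced-token-detection objective: a small generator proposes replacements at masked positions, and a discriminator must classify every token as original or replaced, so the model learns from all positions rather than the ~15% that are masked in standard MLM. The following is a minimal toy sketch of that objective, assuming PyTorch; the tiny model sizes, vocabulary, and masking rate are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM, SEQ = 100, 32, 8  # toy sizes, not ELECTRA's actual config

# Toy generator: a small MLM head proposing tokens at corrupted positions.
generator = nn.Sequential(nn.Embedding(VOCAB, DIM), nn.Linear(DIM, VOCAB))
# Toy discriminator: a per-token binary "was this token replaced?" head.
discriminator = nn.Sequential(nn.Embedding(VOCAB, DIM), nn.Linear(DIM, 1))

tokens = torch.randint(0, VOCAB, (1, SEQ))
mask = torch.rand(1, SEQ) < 0.15  # positions chosen for corruption

# The generator samples plausible replacements at the masked positions.
with torch.no_grad():
    gen_logits = generator(tokens)
    sampled = torch.distributions.Categorical(logits=gen_logits).sample()
corrupted = torch.where(mask, sampled, tokens)

# The discriminator is trained on EVERY position, not just the masked
# ones -- this dense training signal is the source of the efficiency claim.
labels = (corrupted != tokens).float()
disc_logits = discriminator(corrupted).squeeze(-1)
loss = nn.functional.binary_cross_entropy_with_logits(disc_logits, labels)
loss.backward()
```

If the generator happens to sample the original token, that position is labeled "original", exactly as in the paper's formulation.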