SOTAVerified

Continual Pretraining

Papers

Showing 41–50 of 70 papers

Title | Status | Hype
Breaking the Stage Barrier: A Novel Single-Stage Approach to Long Context Extension for Large Language Models | | 0
Revisiting Pretraining with Adapters | | 0
AfroXLMR-Social: Adapting Pre-trained Language Models for African Languages Social Media Text | | 0
The Construction of Instruction-tuned LLMs for Finance without Instruction Data Using Continual Pretraining and Model Merging | | 0
AdaPrompt: Adaptive Model Training for Prompt-based NLP | | 0
BAMBINO-LM: (Bilingual-)Human-Inspired Continual Pretraining of BabyLM | | 0
Bilingual Adaptation of Monolingual Foundation Models | | 0
Biomed-Enriched: A Biomedical Dataset Enriched with LLMs for Pretraining and Extracting Rare and Hidden Content | | 0
Investigating Continual Pretraining in Large Language Models: Insights and Implications | | 0
Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks | | 0
Page 5 of 7

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.69 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CPT | F1 (macro) | 63.77 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.71 | | Unverified