SOTAVerified

Continual Pretraining

Papers

Showing 31–40 of 70 papers

| Title | Status | Hype |
|---|---|---|
| Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code | — | 0 |
| DD-TIG at Constraint@ACL2022: Multimodal Understanding and Reasoning for Role Labeling of Entities in Hateful Memes | — | 0 |
| DoPAMine: Domain-specific Pre-training Adaptation from seed-guided data Mining | — | 0 |
| Efficient Domain-adaptive Continual Pretraining for the Process Industry in the German Language | — | 0 |
| Enhance Mobile Agents Thinking Process Via Iterative Preference Learning | — | 0 |
| On the Robustness of Reading Comprehension Models to Entity Renaming | — | 0 |
| Open Generative Large Language Models for Galician | — | 0 |
| Overcoming Vocabulary Mismatch: Vocabulary-agnostic Teacher Guided Language Modeling | — | 0 |
| PARAMANU-AYN: Pretrain from scratch or Continual Pretraining of LLMs for Legal Domain Adaptation? | — | 0 |
| Pretraining and Updates of Domain-Specific LLM: A Case Study in the Japanese Business Domain | — | 0 |
Page 4 of 7

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DAS | F1 (macro) | 0.69 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CPT | F1 - macro | 63.77 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DAS | F1 (macro) | 0.71 | — | Unverified |