SOTAVerified

Continual Pretraining

Papers

Showing 41-50 of 70 papers

Title | Status | Hype
Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code |  | 0
PARAMANU-AYN: Pretrain from scratch or Continual Pretraining of LLMs for Legal Domain Adaptation? |  | 0
Yi: Open Foundation Models by 01.AI | Code | 9
Investigating Continual Pretraining in Large Language Models: Insights and Implications |  | 0
Data Engineering for Scaling Language Models to 128K Context | Code | 3
Autonomous Data Selection with Zero-shot Generative Classifiers for Mathematical Texts | Code | 2
Continual Learning for Large Language Models: A Survey |  | 0
RomanSetu: Efficiently unlocking multilingual capabilities of Large Language Models via Romanization | Code | 0
PECoP: Parameter Efficient Continual Pretraining for Action Quality Assessment | Code | 0
Effective Long-Context Scaling of Foundation Models | Code | 2

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.69 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CPT | F1 - macro | 63.77 |  | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.71 |  | Unverified