SOTAVerified

Continual Pretraining

Papers

Showing 51–60 of 70 papers

Title | Status | Hype
ChuXin: 1.6B Technical Report | | 0
Pretraining and Updates of Domain-Specific LLM: A Case Study in the Japanese Business Domain | | 0
CEM: A Data-Efficient Method for Large Language Models to Continue Evolving From Mistakes | | 0
Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code | | 0
PARAMANU-AYN: Pretrain from scratch or Continual Pretraining of LLMs for Legal Domain Adaptation? | | 0
Investigating Continual Pretraining in Large Language Models: Insights and Implications | | 0
Continual Learning for Large Language Models: A Survey | | 0
RomanSetu: Efficiently unlocking multilingual capabilities of Large Language Models via Romanization | Code | 0
PECoP: Parameter Efficient Continual Pretraining for Action Quality Assessment | Code | 0
AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model | Code | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.69 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CPT | F1 (macro) | 63.77 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | DAS | F1 (macro) | 0.71 | | Unverified