SOTAVerified

Continual Pretraining

Papers

Showing 21-30 of 70 papers

| Title | Status | Hype |
| --- | --- | --- |
| Continual Pre-Training Mitigates Forgetting in Language and Vision | Code | 1 |
| Towards Geospatial Foundation Models via Continual Pretraining | Code | 1 |
| Multi-Label Guided Soft Contrastive Learning for Efficient Earth Observation Pretraining | Code | 1 |
| ChuXin: 1.6B Technical Report | | 0 |
| DoPAMine: Domain-specific Pre-training Adaptation from seed-guided data Mining | | 0 |
| AstroMLab 2: AstroLLaMA-2-70B Model and Benchmarking Specialised LLMs for Astronomy | | 0 |
| AfroXLMR-Social: Adapting Pre-trained Language Models for African Languages Social Media Text | | 0 |
| Breaking the Stage Barrier: A Novel Single-Stage Approach to Long Context Extension for Large Language Models | | 0 |
| Biomed-Enriched: A Biomedical Dataset Enriched with LLMs for Pretraining and Extracting Rare and Hidden Content | | 0 |
| DD-TIG at Constraint@ACL2022: Multimodal Understanding and Reasoning for Role Labeling of Entities in Hateful Memes | | 0 |
Page 3 of 7

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | DAS | F1 (macro) | 0.69 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CPT | F1 - macro | 63.77 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | DAS | F1 (macro) | 0.71 | | Unverified |