SOTAVerified

Continual Pretraining

Papers

Showing 51–60 of 70 papers

| Title | Status | Hype |
| --- | --- | --- |
| CTP: Towards Vision-Language Continual Pretraining via Compatible Momentum Contrast and Topology Preservation | Code | 1 |
| Towards Geospatial Foundation Models via Continual Pretraining | Code | 1 |
| Continual Pre-training of Language Models | Code | 2 |
| CTP: Towards Vision-Language Continual Pretraining via Compatible Momentum Contrast and Topology Preservation | Code | 1 |
| AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model | Code | 0 |
| Unsupervised Domain Adaptation for Sparse Retrieval by Filling Vocabulary and Word Frequency Gaps | Code | 0 |
| Continual Training of Language Models for Few-Shot Learning | Code | 2 |
| Continual Pre-Training Mitigates Forgetting in Language and Vision | Code | 1 |
| DD-TIG at Constraint@ACL2022: Multimodal Understanding and Reasoning for Role Labeling of Entities in Hateful Memes | | 0 |
| Hierarchical Label-wise Attention Transformer Model for Explainable ICD Coding | Code | 0 |
Page 6 of 7

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | DAS | F1 (macro) | 0.69 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CPT | F1 (macro) | 63.77 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | DAS | F1 (macro) | 0.71 | | Unverified |