SOTAVerified

Continual Pretraining

Papers

Showing 21–30 of 70 papers

| Title | Status | Hype |
|-------|--------|------|
| On the Robustness of Reading Comprehension Models to Entity Renaming | Code | 1 |
| Efficient Contrastive Learning via Novel Data Augmentation and Curriculum Learning | Code | 1 |
| ECONET: Effective Continual Pretraining of Language Models for Event Temporal Reasoning | Code | 1 |
| Biomed-Enriched: A Biomedical Dataset Enriched with LLMs for Pretraining and Extracting Rare and Hidden Content | | 0 |
| LLaVA-c: Continual Improved Visual Instruction Tuning | | 0 |
| Simulating Training Data Leakage in Multiple-Choice Benchmarks for LLM Evaluation | Code | 0 |
| A Japanese Language Model and Three New Evaluation Benchmarks for Pharmaceutical NLP | Code | 0 |
| Enhance Mobile Agents Thinking Process Via Iterative Preference Learning | | 0 |
| Mining Hidden Thoughts from Texts: Evaluating Continual Pretraining with Synthetic Data for LLM Reasoning | | 0 |
| Efficient Domain-adaptive Continual Pretraining for the Process Industry in the German Language | | 0 |
Page 3 of 7

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | DAS | F1 (macro) | 0.69 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | CPT | F1 (macro) | 63.77 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | DAS | F1 (macro) | 0.71 | | Unverified |