| Title | Date | Tasks | Code |
| --- | --- | --- | --- |
| Hierarchical Label-wise Attention Transformer Model for Explainable ICD Coding | Apr 22, 2022 | Continual Pretraining | Code Available |
| Unsupervised Domain Adaptation for Sparse Retrieval by Filling Vocabulary and Word Frequency Gaps | Nov 8, 2022 | Continual Pretraining, Domain Adaptation | Code Available |
| PECoP: Parameter Efficient Continual Pretraining for Action Quality Assessment | Nov 11, 2023 | Action Quality Assessment, Continual Pretraining | Code Available |
| AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model | Nov 21, 2022 | Continual Pretraining, Language Modeling | Code Available |
| LangSAMP: Language-Script Aware Multilingual Pretraining | Sep 26, 2024 | Continual Pretraining, Language Modeling | Code Available |
| Alchemy: Amplifying Theorem-Proving Capability through Symbolic Mutation | Oct 21, 2024 | Automated Theorem Proving, Continual Pretraining | Code Available |
| PARAMANU-AYN: Pretrain from scratch or Continual Pretraining of LLMs for Legal Domain Adaptation? | Mar 20, 2024 | Abstractive Text Summarization, Continual Pretraining | Unverified |
| Pretraining and Updates of Domain-Specific LLM: A Case Study in the Japanese Business Domain | Apr 12, 2024 | Continual Pretraining, General Knowledge | Unverified |
| RedWhale: An Adapted Korean LLM Through Efficient Continual Pretraining | Aug 21, 2024 | Continual Pretraining, Cross-Lingual Transfer | Unverified |
| Revisiting Pretraining with Adapters | Aug 1, 2021 | Continual Pretraining, Transfer Learning | Unverified |