| Paper | Date | Tasks | Code | Stars |
|---|---|---|---|---|
| ChuXin: 1.6B Technical Report | May 8, 2024 | Continual Pretraining, Language Modeling | Unverified | 0 |
| Pretraining and Updates of Domain-Specific LLM: A Case Study in the Japanese Business Domain | Apr 12, 2024 | Continual Pretraining, General Knowledge | Unverified | 0 |
| CEM: A Data-Efficient Method for Large Language Models to Continue Evolving From Mistakes | Apr 11, 2024 | Continual Learning, Continual Pretraining | Unverified | 0 |
| Aurora-M: Open Source Continual Pre-training for Multilingual Language and Code | Mar 30, 2024 | Continual Pretraining, Language Modelling | Unverified | 0 |
| PARAMANU-AYN: Pretrain from scratch or Continual Pretraining of LLMs for Legal Domain Adaptation? | Mar 20, 2024 | Abstractive Text Summarization, Continual Pretraining | Unverified | 0 |
| Investigating Continual Pretraining in Large Language Models: Insights and Implications | Feb 27, 2024 | Continual Learning, Continual Pretraining | Unverified | 0 |
| Continual Learning for Large Language Models: A Survey | Feb 2, 2024 | Continual Learning, Continual Pretraining | Unverified | 0 |
| RomanSetu: Efficiently unlocking multilingual capabilities of Large Language Models via Romanization | Jan 25, 2024 | Continual Pretraining, Sentiment Analysis | Code Available | 0 |
| PECoP: Parameter Efficient Continual Pretraining for Action Quality Assessment | Nov 11, 2023 | Action Quality Assessment, Continual Pretraining | Code Available | 0 |
| AF Adapter: Continual Pretraining for Building Chinese Biomedical Language Model | Nov 21, 2022 | Continual Pretraining, Language Modeling | Code Available | 0 |