Xmodel-2 Technical Report

2024-12-27

Wang Qun, Liu Yang, Lin Qingquan, Qu Zhijiu, Jiang Ling


Abstract

Xmodel-2 is a 1.2-billion-parameter large language model designed specifically for reasoning tasks. Its architecture enables different model scales to share a unified set of hyperparameters, allowing for extensive experimentation on smaller models and seamless transfer of optimal configurations to larger models. To maximize training efficiency and stability, Xmodel-2 employs the WSD learning rate scheduler from MiniCPM. Pretrained on 1.5 trillion tokens from diverse sources, Xmodel-2 achieves state-of-the-art performance in complex reasoning and agent-based tasks, while maintaining low training costs. These results highlight the potential of efficient model design and training strategies in advancing reasoning capabilities. Model checkpoints and code are publicly available on GitHub at https://github.com/XiaoduoAILab/Xmodel-2.
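The WSD (Warmup-Stable-Decay) scheduler mentioned in the abstract splits training into three phases: a short linear warmup, a long stable phase at the peak learning rate, and a final decay to a low learning rate. The sketch below illustrates only this general shape; the function name, phase fractions, learning-rate values, and the cosine decay form are illustrative assumptions, not Xmodel-2's actual configuration (the MiniCPM report, for instance, discusses other decay forms such as exponential annealing).

```python
import math

def wsd_lr(step: int, total_steps: int, peak_lr: float = 1e-2,
           min_lr: float = 1e-4, warmup_frac: float = 0.01,
           decay_frac: float = 0.1) -> float:
    """Illustrative Warmup-Stable-Decay (WSD) schedule.

    All default values here are placeholder assumptions,
    not the hyperparameters used to train Xmodel-2.
    """
    warmup_steps = max(int(total_steps * warmup_frac), 1)
    decay_steps = max(int(total_steps * decay_frac), 1)
    stable_end = total_steps - decay_steps

    if step < warmup_steps:
        # Phase 1: linear warmup from 0 up to peak_lr.
        return peak_lr * step / warmup_steps
    if step < stable_end:
        # Phase 2: hold the peak learning rate constant
        # for the bulk of training.
        return peak_lr
    # Phase 3: anneal from peak_lr down to min_lr over the
    # final decay_frac of training (cosine shape assumed here).
    progress = min((step - stable_end) / decay_steps, 1.0)
    return min_lr + 0.5 * (peak_lr - min_lr) * (1.0 + math.cos(math.pi * progress))
```

Plotting this function over the full step range yields the characteristic WSD shape: a ramp, a long plateau, and a short tail. One practical appeal of the stable phase is that intermediate checkpoints can later be resumed and annealed separately, rather than committing to a single decay horizon up front.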
