DeInfoReg: A Decoupled Learning Framework for Better Training Throughput

2025-06-22

Zih-Hao Huang, You-Teng Lin, Hung-Hsuan Chen


Abstract

This paper introduces Decoupled Supervised Learning with Information Regularization (DeInfoReg), a novel approach that transforms a long gradient flow into multiple shorter ones, thereby mitigating the vanishing gradient problem. By integrating a pipeline strategy, DeInfoReg enables model parallelization across multiple GPUs, significantly improving training throughput. We compare the proposed method with standard backpropagation (BP) and other gradient flow decomposition techniques. Extensive experiments on diverse tasks and datasets demonstrate that DeInfoReg achieves superior performance and better noise resistance than models trained with standard BP, and that it utilizes parallel computing resources efficiently. The code for reproducibility is available at: https://github.com/ianzih/Decoupled-Supervised-Learning-for-Information-Regularization/.
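To make the core idea concrete, the sketch below shows the general pattern of decoupling one long gradient flow into several short, block-local ones: each block gets its own small prediction head and local loss, and no gradient crosses block boundaries. This is a minimal NumPy illustration of that generic pattern, not the authors' implementation; in particular, it omits DeInfoReg's information regularization term and its pipelined multi-GPU execution, and all names (`W1`, `H1`, etc.) are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data
X = rng.normal(size=(64, 4))
y = (X @ np.array([1.0, -2.0, 0.5, 0.3]))[:, None]

# Two blocks; each has its own local head (H*) and local loss,
# so each gradient flow spans only one block instead of the whole model.
W1 = rng.normal(scale=0.1, size=(4, 8)); H1 = rng.normal(scale=0.1, size=(8, 1))
W2 = rng.normal(scale=0.1, size=(8, 8)); H2 = rng.normal(scale=0.1, size=(8, 1))

lr, n = 0.05, len(X)
mses = []
for _ in range(200):
    # Block 1: forward, local loss, local gradients only.
    A1 = X @ W1
    E1 = A1 @ H1 - y                      # residual of block-1's local head
    gH1 = A1.T @ E1 / n
    gW1 = X.T @ (E1 @ H1.T) / n
    W1 -= lr * gW1; H1 -= lr * gH1

    # Block 2 consumes block-1's output as a *detached* input:
    # no gradient from block 2 ever reaches W1 (the flow is cut here).
    A1_detached = X @ W1
    A2 = A1_detached @ W2
    E2 = A2 @ H2 - y                      # residual of block-2's local head
    gH2 = A2.T @ E2 / n
    gW2 = A1_detached.T @ (E2 @ H2.T) / n
    W2 -= lr * gW2; H2 -= lr * gH2

    mses.append(float(np.mean(E2 ** 2)))

print(mses[0], "->", mses[-1])
```

Because the two blocks never exchange gradients, their backward passes are independent, which is what allows the pipelined, per-GPU parallel training the abstract describes.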
