SOTAVerified

Learning representations by forward-propagating errors

2023-08-17

Ryoungwoo Jang


Abstract

Back-propagation (BP) is a widely used learning algorithm for neural network optimization. However, BP incurs an enormous computational cost and is too slow to train on a central processing unit (CPU); consequently, current neural network optimization is performed on graphics processing units (GPUs) with compute unified device architecture (CUDA) programming. In this paper, we propose a lightweight, fast learning algorithm that runs on a CPU as fast as CUDA-accelerated training on a GPU. The algorithm is based on a forward-propagating method, using the concept of dual numbers from algebraic geometry.
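The core idea behind forward-propagating derivatives with dual numbers can be illustrated with a minimal sketch; this is not the paper's algorithm, just the standard dual-number construction (a + b·ε with ε² = 0), where the ε coefficient carries the derivative alongside the value:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; the eps coefficient
    carries the derivative through arithmetic (forward-mode differentiation)."""
    def __init__(self, real, eps=0.0):
        self.real = real
        self.eps = eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps**2 == 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f at x + 1*eps; the eps part of the result is f'(x)."""
    return f(Dual(x, 1.0)).eps


# f(x) = 3x^2 + 2x, so f'(2) = 6*2 + 2 = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))  # → 14.0
```

A single forward pass through the computation yields both the function value and its derivative, which is what makes a forward-propagating scheme attractive on hardware without massive parallelism.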
