Stochastic Markov Gradient Descent and Training Low-Bit Neural Networks

2020-08-25

Jonathan Ashbrock, Alexander M. Powell

Abstract

The massive size of modern neural networks has motivated substantial recent interest in neural network quantization. We introduce Stochastic Markov Gradient Descent (SMGD), a discrete optimization method applicable to training quantized neural networks. The SMGD algorithm is designed for settings where memory is highly constrained during training. We provide theoretical guarantees of algorithm performance as well as encouraging numerical results.
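The abstract names the algorithm but does not spell out its update rule, so the sketch below is one plausible reading rather than the authors' stated method: weights stay on a fixed quantization grid throughout training, and each coordinate hops one grid step in the descent direction with probability proportional to its gradient magnitude, so only quantized values are ever stored. The function name smgd_step, the grid step delta, and the learning rate lr are all illustrative assumptions.

```python
import numpy as np

def smgd_step(weights, grads, delta, lr, rng):
    """One SMGD-style update on a fixed quantization grid (illustrative).

    Each coordinate hops one grid step of size `delta` in the descent
    direction with probability min(1, lr * |grad| / delta), so the
    iterates form a Markov chain on the grid and no full-precision
    copy of the weights needs to be kept during training.
    """
    hop_prob = np.clip(lr * np.abs(grads) / delta, 0.0, 1.0)
    hops = rng.random(weights.shape) < hop_prob
    return weights - hops * delta * np.sign(grads)

# Toy usage: ten weights on a grid of step 0.25, one update with a
# stand-in gradient (a real run would use backprop gradients).
rng = np.random.default_rng(0)
delta, lr = 0.25, 0.1
w = delta * rng.integers(-4, 5, size=10).astype(float)
g = rng.standard_normal(10)
w = smgd_step(w, g, delta, lr, rng)
```

Because the update only ever adds or subtracts delta, the iterates remain exactly representable in the low-bit format, which is the memory-constrained setting the abstract describes.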
