
Quantization in Layer's Input is Matter

2022-02-10

Daning Cheng, WenGuang Chen


Abstract

In this paper, we show that quantization of a layer's input affects the loss function more strongly than quantization of its parameters, and that a mixed-precision layout algorithm driven by each layer's input quantization error outperforms Hessian-based mixed-precision layout algorithms.
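The mixed-precision idea stated in the abstract can be illustrated with a small sketch: measure each layer's input quantization error on a calibration batch and rank layers by that error, so the most sensitive layers can be kept at higher precision. The sketch below is an illustration under stated assumptions, not the paper's actual algorithm; the PyTorch hook approach, the min-max uniform quantizer, and the helper names `input_quant_error` and `rank_layers_by_input_error` are all hypothetical choices.

```python
import torch
import torch.nn as nn


def input_quant_error(x: torch.Tensor, num_bits: int) -> float:
    """Mean squared error introduced by uniform min-max quantization of a layer input.
    This quantizer is an assumption; the paper may use a different scheme."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    scale = scale.clamp(min=1e-12)
    zero_point = qmin - x.min() / scale
    x_q = ((x / scale + zero_point).round().clamp(qmin, qmax) - zero_point) * scale
    return torch.mean((x - x_q) ** 2).item()


def rank_layers_by_input_error(model: nn.Module, calib_batch: torch.Tensor, num_bits: int = 8):
    """Collect per-layer input quantization error on one calibration batch,
    then rank layers so the most sensitive ones can be assigned more bits."""
    errors = {}
    hooks = []

    def make_hook(name):
        def hook(module, inputs, output):
            errors[name] = input_quant_error(inputs[0].detach(), num_bits)
        return hook

    # Hook only the layer types whose inputs we want to profile (an assumption).
    for name, module in model.named_modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)):
            hooks.append(module.register_forward_hook(make_hook(name)))

    with torch.no_grad():
        model(calib_batch)
    for h in hooks:
        h.remove()

    # Layers with the largest input quantization error are treated as most sensitive.
    return sorted(errors.items(), key=lambda kv: kv[1], reverse=True)
```

A bit-width layout could then keep the top-ranked layers at higher precision and quantize the rest more aggressively; the precise assignment rule would follow the paper's method rather than this sketch.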
