
HPTQ: Hardware-Friendly Post Training Quantization

2021-09-19 · Code Available

Hai Victor Habi, Reuven Peretz, Elad Cohen, Lior Dikstein, Oranit Dror, Idit Diamant, Roy H. Jennings, Arnon Netzer



Abstract

Neural network quantization enables the deployment of models on edge devices. An essential requirement for their hardware efficiency is that the quantizers are hardware-friendly: uniform, symmetric, and with power-of-two thresholds. To the best of our knowledge, current post-training quantization methods do not support all of these constraints simultaneously. In this work, we introduce a hardware-friendly post training quantization (HPTQ) framework, which addresses this problem by synergistically combining several known quantization methods. We perform a large-scale study on four tasks: classification, object detection, semantic segmentation and pose estimation over a wide variety of network architectures. Our extensive experiments show that competitive results can be obtained under hardware-friendly constraints.
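The abstract's three hardware-friendly constraints (uniform steps, symmetric range, power-of-two thresholds) can be illustrated with a minimal "fake quantization" sketch. This is not the HPTQ implementation; the function names and the max-based threshold choice are illustrative assumptions, since the paper combines several threshold-selection and post-training techniques.

```python
import numpy as np

def power_of_two_threshold(x):
    # Smallest power-of-two threshold covering the tensor's max magnitude
    # (one simple choice; actual threshold selection in HPTQ may differ).
    max_abs = np.max(np.abs(x))
    return 2.0 ** np.ceil(np.log2(max_abs))

def quantize_symmetric_uniform(x, n_bits=8):
    """Uniform, symmetric quantization with a power-of-two threshold.

    Illustrative sketch only. Returns the dequantized ("fake quant")
    values, as typically used to simulate quantization error.
    """
    t = power_of_two_threshold(x)
    q_max = 2 ** (n_bits - 1) - 1        # e.g. 127 for 8-bit
    scale = t / (2 ** (n_bits - 1))      # power-of-two scale -> shift-friendly in HW
    q = np.clip(np.round(x / scale), -q_max - 1, q_max)  # integer grid
    return q * scale
```

Because both the threshold and the resulting scale are powers of two, rescaling on hardware reduces to bit shifts, which is the efficiency argument behind these constraints.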


Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| COCO (Common Objects in Context) | SSD ResNet50 V1 FPN 640x640 | mAP | 34.3 | | Unverified |
| ImageNet | Xception W8A8 | Top-1 Accuracy (%) | 78.97 | | Unverified |
| ImageNet | EfficientNet-B0 ReLU W8A8 | Top-1 Accuracy (%) | 77.09 | | Unverified |
| ImageNet | EfficientNet-B0 W8A8 | Top-1 Accuracy (%) | 74.22 | | Unverified |
| ImageNet | DenseNet-121 W8A8 | Top-1 Accuracy (%) | 73.36 | | Unverified |
| ImageNet | MobileNetV2 W8A8 | Top-1 Accuracy (%) | 71.46 | | Unverified |
