SOTAVerified

EXACT: How to Train Your Accuracy

2022-05-19 · Code Available

Ivan Karpukhin, Stanislav Dereka, Sergey Kolesnikov

Abstract

Classification tasks are usually evaluated in terms of accuracy. However, accuracy is discontinuous and cannot be directly optimized by gradient ascent. Popular methods minimize cross-entropy, hinge loss, or other surrogate losses, which can lead to suboptimal results. In this paper, we propose a new optimization framework that introduces stochasticity into a model's output and optimizes the expected accuracy, i.e., the accuracy of the stochastic model. Extensive experiments on linear models and deep image classification show that the proposed optimization method is a powerful alternative to widely used classification losses.
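The core idea in the abstract can be illustrated on a toy binary case. The sketch below is not the authors' EXACT implementation; it assumes Gaussian noise added to a linear model's score, in which case the expected accuracy has a closed form (a normal CDF of the margin) and becomes a smooth objective that gradient ascent can maximize directly. All names and hyperparameters are illustrative.

```python
# Sketch: maximizing expected accuracy of a stochastic binary linear model.
# With score s = w.x + N(0, sigma^2), the probability of a correct prediction
# is Phi(y * (w.x) / sigma), which is differentiable in w (unlike 0/1 accuracy).
import math
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs in 2D, labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 1.0, size=(100, 2)),
               rng.normal(+1.0, 1.0, size=(100, 2))])
y = np.hstack([-np.ones(100), np.ones(100)])

# Standard normal CDF, vectorized over arrays.
phi_cdf = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def expected_accuracy(w, sigma=1.0):
    # Mean probability of a correct prediction under the score noise.
    return phi_cdf(y * (X @ w) / sigma).mean()

w = np.array([0.1, -0.1])
sigma, lr = 1.0, 0.5
for _ in range(200):
    m = y * (X @ w) / sigma
    pdf = np.exp(-0.5 * m ** 2) / math.sqrt(2.0 * math.pi)
    # d/dw of mean Phi(y * w.x / sigma) = mean phi(m) * y * x / sigma.
    grad = (pdf[:, None] * y[:, None] * X / sigma).mean(axis=0)
    w += lr * grad  # gradient *ascent* on expected accuracy

hard_acc = np.mean(np.sign(X @ w) == y)  # accuracy of the deterministic model
```

In the multiclass deep-learning setting of the paper, the expectation no longer has a simple closed form and must be estimated, but the principle is the same: the stochastic model's expected accuracy is smooth even though the underlying 0/1 accuracy is not.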

Benchmark Results

Dataset   | Model             | Metric             | Claimed | Verified | Status
CIFAR-10  | EXACT (WRN-28-10) | Percentage correct | 96.73   | —        | Unverified
CIFAR-100 | EXACT (WRN-28-10) | Percentage correct | 82.68   | —        | Unverified
MNIST     | EXACT (M3-CNN)    | Percentage error   | 0.33    | —        | Unverified
SVHN      | EXACT (WRN-16-8)  | Percentage error   | 2.21    | —        | Unverified

Reproductions