Extended Batch Normalization

2020-03-12

Chunjie Luo, Jianfeng Zhan, Lei Wang, Wanling Gao

Abstract

Batch normalization (BN) has become a standard technique for training modern deep networks. However, its effectiveness diminishes as the batch size becomes smaller, since the estimation of the batch statistics becomes inaccurate. This hinders batch normalization's use for 1) training larger models, which requires small batches due to memory constraints, and 2) training on mobile or embedded devices, where memory resources are limited. In this paper, we propose a simple but effective method called extended batch normalization (EBN). For NCHW-format feature maps, extended batch normalization computes the mean along the (N, H, W) dimensions, the same as batch normalization, to retain batch normalization's advantages. To alleviate the problem caused by small batch sizes, extended batch normalization computes the standard deviation along the (N, C, H, W) dimensions, thus enlarging the number of samples from which the standard deviation is computed. We compare extended batch normalization with batch normalization and group normalization on MNIST, CIFAR-10/100, STL-10, and ImageNet. The experiments show that extended batch normalization alleviates the problem of batch normalization with small batch sizes while achieving performance close to that of batch normalization with large batch sizes.

Benchmark Results

The Model column gives the normalization method and training batch size. No result has been independently verified yet.

Dataset   Model                 Metric               Claimed   Verified   Status
STL-10    ResNet18 (BN, 4)      Percentage correct   81.04     -          Unverified
STL-10    ResNet18 (GN, 4)      Percentage correct   79.3      -          Unverified
STL-10    ResNet18 (BN, 128)    Percentage correct   78.65     -          Unverified
STL-10    ResNet18 (EBN, 4)     Percentage correct   76.49     -          Unverified
STL-10    ResNet18 (EBN, 128)   Percentage correct   75.57     -          Unverified
STL-10    ResNet18 (GN, 128)    Percentage correct   72.66     -          Unverified
