Stochastic Pooling for Regularization of Deep Convolutional Neural Networks
2013-01-16
Matthew D. Zeiler, Rob Fergus
- github.com/szagoruyko/imagine-nn (Torch)
Abstract
We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.
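The abstract's procedure — sampling one activation per pooling region from a multinomial distribution proportional to the activations — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function name and the test-time probability-weighted average (the expectation the paper uses at inference) are written here as assumptions for clarity.

```python
import numpy as np

def stochastic_pool(region, rng, train=True):
    """Illustrative stochastic pooling over one pooling region.

    region: 1-D array of non-negative activations (e.g. post-ReLU).
    During training, sample one activation with probability
    proportional to its value; at test time, return the
    probability-weighted average (the expectation of the sample).
    """
    total = region.sum()
    if total == 0:
        return 0.0  # all-zero region: nothing to sample
    p = region / total  # multinomial probabilities from activities
    if train:
        idx = rng.choice(len(region), p=p)
        return float(region[idx])
    # Test time: expectation under the multinomial distribution.
    return float((p * region).sum())
```

For a hypothetical region `[1.6, 0.0, 0.0, 2.4]`, the probabilities are `[0.4, 0, 0, 0.6]`, so training returns either 1.6 or 2.4 at random, while the test-time expectation is `0.4 * 1.6 + 0.6 * 2.4 = 2.08`.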
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| CIFAR-10 | Stochastic Pooling | Percentage correct | 84.9 | — | Unverified |
| CIFAR-100 | Stochastic Pooling | Percentage correct | 57.5 | — | Unverified |
| SVHN | Stochastic Pooling | Percentage error | 2.8 | — | Unverified |