Regularization of Neural Networks using DropConnect

2013-06-13 · ICML'13: Proceedings of the 30th International Conference on Machine Learning, Volume 28 (2013)

Li Wan, Matthew Zeiler, Sixin Zhang, Yann LeCun, Rob Fergus


Abstract

We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing it to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
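To make the contrast in the abstract concrete, here is a minimal NumPy sketch of training-time masking, not the authors' implementation: Dropout samples a Bernoulli mask over a layer's output activations, while DropConnect samples the mask over the weight matrix itself. The layer sizes, keep probability `p`, and function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, b, p=0.5):
    """Dropout: zero a random subset of this layer's output activations.

    p is the keep probability; sizes and names are illustrative.
    """
    a = np.maximum(0.0, W @ x + b)      # ReLU activations
    mask = rng.random(a.shape) < p      # Bernoulli(p) mask over units
    return a * mask

def dropconnect_layer(x, W, b, p=0.5):
    """DropConnect: zero a random subset of weights instead of activations,
    so each unit receives input from a random subset of previous-layer units.
    """
    M = rng.random(W.shape) < p         # Bernoulli(p) mask over weights
    return np.maximum(0.0, (M * W) @ x + b)

# Toy usage with illustrative dimensions (8 inputs, 4 output units).
x = rng.standard_normal(8)
W = rng.standard_normal((4, 8))
b = np.zeros(4)
print(dropout_layer(x, W, b))
print(dropconnect_layer(x, W, b))
```

Note that this sketch covers training-time masking only; at test time the paper approximates the expectation over weight masks (via a Gaussian moment-matching approximation) rather than running a single masked forward pass.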
