
Diversity Based Edge Pruning of Neural Networks Using Determinantal Point Processes

2021-03-04 · ICLR Workshop on Neural Compression 2021

Rupam Acharyya, Boyu Zhang, Ankani Chattoraj, Shouman Das, Daniel Stefankovic


Abstract

Deep learning architectures with huge numbers of parameters are often compressed using pruning techniques. Two classes of pruning techniques are node pruning and edge pruning. A fairly recent work established that Determinantal Point Process (DPP) based node pruning empirically outperforms competing node pruning methods. However, one prominent appeal of edge pruning over node pruning is the consistent finding in the literature that sparse (edge-pruned) neural networks generalize better than dense (node-pruned) ones. Building on this previous work, and drawing motivation from synaptic diversity in the brain, we propose a novel diversity-based edge pruning technique for neural networks using DPPs. We then show empirically that DPP edge pruning outperforms competing methods (both edge and node pruning) on real data.
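The core idea of DPP-based pruning is to keep a subset of units whose feature vectors are mutually diverse, since a DPP assigns higher probability to subsets whose kernel submatrix has a larger determinant. The paper's exact edge-pruning procedure is not given in the abstract, so the sketch below is only an illustration of the general mechanism: a greedy MAP-style selection under an L-ensemble kernel `L = F F^T`, where each row of `F` is a (hypothetical) feature vector for one prunable edge, e.g. its activation pattern over a batch.

```python
import numpy as np

def greedy_dpp_select(features, k):
    """Greedily pick k diverse items under an L-ensemble DPP with
    kernel L = F F^T (a MAP approximation, not exact DPP sampling).

    features : (n, d) array, one feature vector per prunable edge
               (what to use as features is an assumption here).
    k        : number of edges to keep.
    Returns the list of selected row indices.
    """
    L = features @ features.T
    n = L.shape[0]
    selected, remaining = [], list(range(n))
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            # log-determinant of the candidate submatrix; small ridge
            # term keeps it finite for near-duplicate feature vectors
            sub = L[np.ix_(idx, idx)] + 1e-6 * np.eye(len(idx))
            gain = np.linalg.slogdet(sub)[1]
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        remaining.remove(best)
    return selected
```

Edges not in the returned index set would then be zeroed out (masked), yielding a sparse layer. The greedy determinant-maximization step is a standard approximation; exact DPP sampling or faster incremental (Cholesky-based) greedy updates are common refinements.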
