SOTAVerified

On Iterative Neural Network Pruning, Reinitialization, and the Similarity of Masks

2020-01-14

Michela Paganini, Jessica Forde



Abstract

We examine how recently documented, fundamental phenomena in deep learning models subject to pruning are affected by changes in the pruning procedure. Specifically, we analyze differences in the connectivity structure and learning dynamics of pruned models found through a set of common iterative pruning techniques, to address questions of uniqueness of trainable, high-sparsity sub-networks, and their dependence on the chosen pruning method. In convolutional layers, we document the emergence of structure induced by magnitude-based unstructured pruning in conjunction with weight rewinding that resembles the effects of structured pruning. We also show empirical evidence that weight stability can be automatically achieved through apposite pruning techniques.
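The abstract refers to iterative magnitude-based unstructured pruning combined with weight rewinding: train the network, remove the smallest-magnitude surviving weights, reset the remaining weights to their initial values, and repeat. A minimal NumPy sketch of that loop is below; the function names (`magnitude_mask`, `iterative_prune`), the placeholder `train_step` callback, and the specific pruning rate are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def magnitude_mask(weights, rate, prev_mask):
    """Prune the smallest-magnitude fraction `rate` of surviving weights.

    Illustrative sketch: ties at the threshold are not handled specially.
    """
    surviving = np.abs(weights[prev_mask])
    k = int(rate * surviving.size)          # number of weights to remove
    if k == 0:
        return prev_mask.copy()
    threshold = np.sort(surviving)[k - 1]   # k-th smallest surviving magnitude
    return prev_mask & (np.abs(weights) > threshold)

def iterative_prune(w_init, train_step, rounds=3, rate=0.2):
    """Iterative magnitude pruning with weight rewinding.

    `train_step` stands in for a full training run; here it is a
    hypothetical callback that maps weights to trained weights.
    """
    mask = np.ones_like(w_init, dtype=bool)
    w = w_init.copy()
    for _ in range(rounds):
        w = train_step(w * mask)                # train the current sub-network
        mask = magnitude_mask(w, rate, mask)    # prune by trained magnitude
        w = w_init.copy()                       # rewind survivors to init
    return w * mask, mask
```

With a 20% per-round rate, each round removes roughly a fifth of the surviving weights, so sparsity compounds across rounds; the returned weights are the rewound initial values restricted to the final mask.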
