
Model Compression via Symmetries of the Parameter Space

2021-09-29

Iordan Ganev, Robin Walters


Abstract

We provide a theoretical framework for neural networks in terms of the representation theory of quivers, thus revealing symmetries of the parameter space of neural networks. Exploiting these symmetries leads to a model compression algorithm for radial neural networks based on an analogue of the QR decomposition. The algorithm is lossless; the compressed model has the same feedforward function as the original model. If applied before training, optimization of the compressed model by gradient descent is equivalent to a projected version of gradient descent on the original model.
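To illustrate the lossless property, here is a minimal sketch of the core idea for a single bias-free hidden layer (the paper's algorithm handles biases and deeper networks; the specific activation and dimensions below are illustrative assumptions, not taken from the paper). A radial activation rescales its input by a function of the norm, ρ(z) = g(‖z‖)·z, so it commutes with matrices having orthonormal columns: ρ(Qy) = Qρ(y) since ‖Qy‖ = ‖y‖. Writing the first weight matrix as W₁ = QR via the reduced QR decomposition then lets Q be absorbed into the next layer, shrinking the hidden width without changing the feedforward function:

```python
import numpy as np

rng = np.random.default_rng(0)

def radial(z):
    # Radial activation: rescale z by a function of its norm.
    # g(r) = tanh(r)/r is an illustrative choice, not the paper's.
    r = np.linalg.norm(z)
    return (np.tanh(r) / r) * z

d_in, d_hidden, d_out = 3, 10, 2
W1 = rng.standard_normal((d_hidden, d_in))
W2 = rng.standard_normal((d_out, d_hidden))

def forward(A, B, x):
    return B @ radial(A @ x)

# QR-based compression: W1 = Q R, with Q (10x3) having orthonormal
# columns and R (3x3) upper triangular. Since radial activations
# commute with Q, the hidden width drops from 10 to 3.
Q, R = np.linalg.qr(W1)
W1_c, W2_c = R, W2 @ Q

x = rng.standard_normal(d_in)
assert np.allclose(forward(W1, W2, x), forward(W1_c, W2_c, x))
```

The assertion passes because W₂ρ(QRx) = W₂Qρ(Rx) = (W₂Q)ρ(Rx): the compressed model (R, W₂Q) computes exactly the same function with fewer parameters.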
