A Lightweight and Gradient-Stable Neural Layer

2021-06-08

Yueyao Yu, Yin Zhang


Abstract

To enhance the resource efficiency and deployability of neural networks, we propose a neural-layer architecture based on Householder weighting and absolute-value activation, called the Householder-absolute neural layer, or Han-layer for short. Compared to a fully connected layer with d neurons and d outputs, a Han-layer reduces the number of parameters and the corresponding computational complexity from O(d^2) to O(d). The Han-layer structure guarantees that the Jacobian of the layer function is always orthogonal, ensuring gradient stability (i.e., freedom from vanishing or exploding gradients) for any Han-layer sub-network. Extensive numerical experiments show that Han-layers can strategically replace fully connected (FC) layers, reducing the number of model parameters while maintaining or even improving generalization performance. We also showcase the capabilities of the Han-layer architecture on a few small stylized models and discuss its current limitations.
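
For concreteness, here is a minimal PyTorch sketch consistent with the abstract's description: the layer computes y = |Hx|, where H = I - 2uu^T is a Householder reflection built from a single trainable d-vector. The class name HanLayer and the single-reflection parameterization are illustrative assumptions based on the abstract, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class HanLayer(nn.Module):
    """Sketch of a Householder-absolute (Han) layer: y = |H x|,
    with H = I - 2 u u^T a Householder reflection and u a unit vector.
    (Hypothetical implementation inferred from the abstract.)"""

    def __init__(self, d: int):
        super().__init__()
        # One trainable d-vector instead of a d x d weight matrix: O(d) parameters.
        self.v = nn.Parameter(torch.randn(d))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize so that H = I - 2 u u^T is exactly orthogonal.
        u = self.v / self.v.norm()
        # Apply H x = x - 2 (u . x) u without forming H: O(d) work per sample.
        hx = x - 2.0 * (x @ u).unsqueeze(-1) * u
        # Absolute-value activation; its Jacobian diag(sign(hx)) is orthogonal,
        # so the layer Jacobian diag(sign(hx)) @ H remains orthogonal.
        return hx.abs()
```

Stacking such layers preserves the property, since a product of orthogonal Jacobians is itself orthogonal; this is the source of the gradient-stability guarantee claimed in the abstract.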
