SOTAVerified

PHNNs: Lightweight Neural Networks via Parameterized Hypercomplex Convolutions

2021-10-08 · Code Available

Eleonora Grassucci, Aston Zhang, Danilo Comminiello


Abstract

Hypercomplex neural networks have been shown to reduce the overall number of parameters while maintaining strong performance by leveraging the properties of Clifford algebras. Recently, hypercomplex linear layers have been further improved by involving efficient parameterized Kronecker products. In this paper, we define the parameterization of hypercomplex convolutional layers and introduce the family of parameterized hypercomplex neural networks (PHNNs), lightweight and efficient large-scale models. Our method learns the convolution rules and the filter organization directly from data, without requiring a rigidly predefined domain structure. PHNNs are flexible enough to operate in any user-defined or tuned domain, from 1D to nD, regardless of whether the algebra rules are preset. This malleability allows multidimensional inputs to be processed in their natural domain without appending extra dimensions, as is done instead in quaternion neural networks for 3D inputs such as color images. As a result, the proposed family of PHNNs operates with 1/n the free parameters of its real-valued analog. We demonstrate the versatility of this approach across multiple application domains by performing experiments on various image and audio datasets, in which our method outperforms real- and quaternion-valued counterparts. Full code is available at: https://github.com/eleGAN23/HyperNets.
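The 1/n parameter saving comes from assembling each weight matrix as a sum of Kronecker products between small learned "algebra rule" matrices and learned filter blocks. Below is a minimal NumPy sketch of that construction; the names (`phc_weight`, `A`, `F`) are illustrative and not the released repository's API.

```python
import numpy as np

def phc_weight(A, F):
    """Build a parameterized hypercomplex weight: W = sum_i kron(A[i], F[i]).

    A: (n, n, n)              -- learned algebra-rule matrices
    F: (n, d_out/n, d_in/n)   -- learned filter blocks
    Returns W of shape (d_out, d_in).
    """
    return sum(np.kron(A[i], F[i]) for i in range(A.shape[0]))

n, d_out, d_in = 4, 64, 64
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n, n))
F = rng.standard_normal((n, d_out // n, d_in // n))

W = phc_weight(A, F)            # full (64, 64) weight, assembled on the fly

real_params = d_out * d_in      # 4096 for a dense real-valued layer
phc_params = A.size + F.size    # 64 + 1024 = 1088, roughly real_params / n
```

Only `A` and `F` are stored and trained; the full weight `W` is reconstructed at each forward pass, so the parameter count drops by roughly a factor of n while the layer's input/output shape is unchanged.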

Tasks

Benchmark Results

Dataset    Model                Metric      Claimed  Verified  Status
L3DAS21    PHC SEDnet n=2       Error Rate  0.39     —         Unverified
L3DAS21    PHC SEDnet n=4       Error Rate  0.45     —         Unverified
L3DAS21    PHC SEDnet n=16      Error Rate  0.51     —         Unverified
L3DAS21    Quaternion SEDnet    Error Rate  0.52     —         Unverified
L3DAS21    PHC SEDnet n=8       Error Rate  0.56     —         Unverified

Reproductions