
Quantization in Spiking Neural Networks

2023-05-13 · Code Available

Bernhard A. Moser, Michael Lunglmayr

Abstract

In spiking neural networks (SNN), at each node, an incoming sequence of weighted Dirac pulses is converted into an output sequence of weighted Dirac pulses by a leaky integrate-and-fire (LIF) neuron model based on spike aggregation and thresholding. We show that this mapping can be understood as a quantization operator and state a corresponding formula for the quantization error by means of the Alexiewicz norm. This analysis has implications for rethinking re-initialization in the LIF model, leading to the proposal of 'reset-to-mod', a modulo-based reset variant.
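To make the reset variants concrete, the following is a minimal discrete-time sketch of a LIF-style neuron, not the paper's implementation: the function name, signature, and the leak parameter `alpha` are assumptions for illustration. It contrasts the common 'reset-to-zero' and 'reset-by-subtraction' modes with a modulo-based 'reset-to-mod', in which an overshoot spanning several multiples of the threshold is emitted as a correspondingly weighted spike and only the residue is kept as membrane potential.

```python
import math


def lif_spikes(weighted_pulses, theta=1.0, alpha=1.0, reset="to-mod"):
    """Toy discrete-time LIF neuron (illustrative sketch, not the paper's code).

    Integrates the incoming weighted pulses with leak factor alpha and emits
    a weighted spike whenever the potential reaches the threshold theta.
    """
    u = 0.0          # membrane potential
    out = []         # output spike weights (0.0 where no spike is fired)
    for x in weighted_pulses:
        u = alpha * u + x        # leaky aggregation of the weighted pulse
        s = 0.0
        if abs(u) >= theta:
            if reset == "to-zero":
                # discard the residue entirely
                s = math.copysign(1.0, u)
                u = 0.0
            elif reset == "by-subtraction":
                # subtract one threshold; residue above 2*theta is kept
                s = math.copysign(1.0, u)
                u -= s * theta
            else:  # "to-mod": emit all full threshold multiples at once
                s = math.copysign(math.floor(abs(u) / theta), u)
                u -= s * theta
        out.append(s)
    return out
```

For a single large pulse of weight 2.5 with theta = 1.0, 'reset-to-mod' emits a spike of weight 2.0 and retains potential 0.5, whereas 'reset-by-subtraction' emits weight 1.0 and retains 1.5; this carried-over residue is what makes the modulo-based reset behave like a quantization operator on the input sequence.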
