Activation Functions for Generalized Learning Vector Quantization - A Performance Comparison
2019-01-17
Thomas Villmann, John Ravichandran, Andrea Villmann, David Nebel, Marika Kaden
Abstract
An appropriate choice of the activation function (such as ReLU, sigmoid, or swish) plays an important role in the performance of (deep) multilayer perceptrons (MLPs) for classification and regression learning. Prototype-based classification methods such as generalized learning vector quantization (GLVQ) are powerful alternatives. These models also employ activation functions, but there they are applied to the so-called classifier function instead. In this paper we investigate activation functions known to be successful in MLPs for application in GLVQ, and study their influence on its performance.
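To make the distinction concrete, the following sketch shows the standard GLVQ classifier function (the relative distance difference, which lies in [-1, 1]) and how the MLP-style activation functions named above could be applied to it. This is an illustrative assumption based on the usual GLVQ cost formulation, not code from the paper; the function names and example distances are hypothetical.

```python
import numpy as np

def classifier_function(d_plus, d_minus):
    """GLVQ relative distance mu(x) in [-1, 1]: negative when the closest
    prototype of the correct class is nearer than the closest incorrect one.
    d_plus / d_minus are distances to the best matching correct / incorrect
    prototype (illustrative names, not from the paper)."""
    return (d_plus - d_minus) / (d_plus + d_minus)

# Candidate activations from the MLP literature, applied here to mu(x)
# rather than to neuron pre-activations.
def sigmoid(x, theta=1.0):
    return 1.0 / (1.0 + np.exp(-theta * x))

def swish(x, beta=1.0):
    return x * sigmoid(x, beta)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical example: the correct prototype is closer, so mu < 0
# and the sample is classified correctly.
d_plus, d_minus = 0.2, 0.8
mu = classifier_function(d_plus, d_minus)          # approximately -0.6
losses = {f.__name__: float(f(mu)) for f in (sigmoid, swish, relu)}
```

The per-sample cost contribution is the chosen activation evaluated at mu(x), summed over the training set; the activation thus shapes which samples (near or far from the decision boundary) dominate the gradient during prototype learning.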