
TeLU: A New Activation Function for Deep Learning

2021-01-01

Marina Adriana Mercioni, Stefan Holban


Abstract

In this paper we propose two novel activation functions, which we call TeLU and TeLU learnable. These proposals combine ReLU (Rectified Linear Unit), the hyperbolic tangent (tanh), and ELU (Exponential Linear Unit), without and with a learnable parameter, respectively. We show that TeLU and TeLU learnable give better results than other popular activation functions, including ReLU, Mish, and TanhExp, using current architectures tested on Computer Vision datasets.
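The abstract names the three building blocks but not TeLU's closed form, so the combination itself is not reproduced here. For reference, a minimal NumPy sketch of the standard definitions of the named ingredients (the exact TeLU formula is given in the paper):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
    # Clamping the exponent avoids overflow for large positive inputs.
    return np.where(x > 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))

# tanh is available directly as np.tanh. Per the abstract, TeLU combines
# these three building blocks, and the "TeLU learnable" variant adds a
# trainable parameter to the combination.
```

The `alpha` parameter is ELU's usual saturation constant; the learnable parameter in TeLU learnable is a separate, trained quantity defined in the paper.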
