
P-Swish: Activation Function with Learnable Parameters Based on Swish Activation Function in Deep Learning

2021-01-01

Marina Adriana Mercioni, Stefan Holban


Abstract

The activation function is a key factor in the performance of a deep neural network, which is why we have continued our research in this direction. We introduce P-Swish (Parametric Swish), a novel activation function that brings performance improvements on object classification tasks using datasets such as CIFAR-10 and CIFAR-100, as well as on datasets for Natural Language Processing (NLP). To test it, we used several architectures, including LeNet-5, Network in Network (NiN), and ResNet34, comparing P-Swish against popular activation functions such as sigmoid, ReLU, Swish, and our earlier proposals. In particular, the P-Swish function facilitates fast network training, which makes it well suited to the transfer learning technique.
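The abstract does not spell out the exact functional form of P-Swish. As a point of reference, the standard Swish activation is f(x) = x · sigmoid(βx), and a parametric variant typically treats β as a learnable scalar trained by gradient descent alongside the network weights. The sketch below illustrates that idea under this assumption; the actual P-Swish parameterization proposed in the paper may differ.

```python
import math

def sigmoid(z):
    # Logistic sigmoid: 1 / (1 + e^(-z))
    return 1.0 / (1.0 + math.exp(-z))

def swish(x, beta=1.0):
    # Swish activation: f(x) = x * sigmoid(beta * x).
    # With beta fixed at 1 this is the standard Swish (SiLU);
    # a parametric variant makes beta a trainable parameter.
    return x * sigmoid(beta * x)

def swish_grad_beta(x, beta):
    # Gradient of f with respect to beta, needed to learn beta:
    # d/dbeta [x * sigmoid(beta*x)] = x^2 * s * (1 - s),
    # where s = sigmoid(beta * x).
    s = sigmoid(beta * x)
    return x * x * s * (1.0 - s)
```

In a deep-learning framework, β would be registered as a module parameter so the optimizer updates it together with the weights; the closed-form gradient above is what autodiff computes implicitly.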
