ProgressiveSpinalNet architecture for FC layers

2021-03-21 · Code Available

Praveen Chopra

Abstract

In deep learning models, the FC (fully connected) layers play the most important role in classifying the input based on the features learned by the previous layers. The FC layers contain the largest share of a model's parameters, and fine-tuning this large number of parameters consumes most of the computational resources, so this paper aims to reduce the parameter count significantly while improving performance. The motivation is inspired by SpinalNet and other biologically inspired architectures. The proposed architecture has a gradient highway between the input and output layers, which mitigates the vanishing-gradient problem in deep networks. Every layer receives the outputs of the previous layers as well as the CNN layer output, so all layers contribute to the decision made at the last layer. This approach improves classification performance over the SpinalNet architecture and achieves SOTA performance on many datasets, such as Caltech101, KMNIST, QMNIST, and EMNIST. The source code is available at https://github.com/praveenchopra/ProgressiveSpinalNet.
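The connectivity pattern described above (each FC layer seeing the CNN features plus the outputs of all earlier FC layers) can be sketched as a minimal NumPy forward pass. This is an illustrative assumption of the wiring, not the paper's actual implementation; the layer sizes, ReLU activation, and function names here are hypothetical.

```python
import numpy as np

def progressive_spinal_head(cnn_features, layer_weights):
    """Sketch of a progressive FC head: every hidden layer receives
    the CNN feature vector concatenated with the outputs of ALL
    previous layers, giving a direct (highway) path from input to
    output. Shapes and activation are assumptions for illustration."""
    outputs = []
    for W, b in layer_weights:
        # Input to this layer = CNN features + every previous output.
        x = np.concatenate([cnn_features] + outputs)
        outputs.append(np.maximum(0.0, W @ x + b))  # ReLU
    return outputs[-1]

# Example wiring: 64-dim CNN features feeding two 32-unit layers.
rng = np.random.default_rng(0)
features = rng.standard_normal(64)
weights, in_dim = [], 64
for width in (32, 32):
    weights.append((rng.standard_normal((width, in_dim)) * 0.1,
                    np.zeros(width)))
    in_dim += width  # the next layer also sees this layer's output

out = progressive_spinal_head(features, weights)
```

Note how `in_dim` grows with each layer: that widening input is exactly what lets every earlier layer's output reach the final decision layer directly.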

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| Bird-225 | Pre-trained wide-resnet-101 | Accuracy | 99.55 | — | Unverified |
| Caltech-101 | Pre-trained wide-resnet-101 | Accuracy | 97.76 | — | Unverified |
| EMNIST-Digits | VGG-5 | Accuracy | 99.82 | — | Unverified |
| EMNIST-Letters | VGG-5 | Accuracy | 95.86 | — | Unverified |
| Fruits 360 | Pre-trained wide-resnet-101 | Accuracy | 99.97 | — | Unverified |
| Kuzushiji-MNIST | VGG-5 | Accuracy | 98.98 | — | Unverified |
| MNIST | Vanilla FC layer only | Accuracy | 98.19 | — | Unverified |
| QMNIST | VGG-5 | Accuracy | 99.69 | — | Unverified |
| STL-10 | Pre-trained wide-resnet-101 | Accuracy | 98.18 | — | Unverified |

Reproductions