
A scalable convolutional neural network for task-specified scenarios via knowledge distillation

2016-09-19

Mengnan Shi, Fei Qin, Qixiang Ye, Zhenjun Han, Jianbin Jiao


Abstract

In this paper, we explore the redundancy in convolutional neural networks, which scales with the complexity of the vision task. Considering that many front-end visual systems are interested in only a limited range of visual targets, removing task-specified network redundancy can enable a wide range of potential applications. We propose a task-specified knowledge distillation algorithm that derives a simplified model with a preset computation cost and minimal accuracy loss, well suited to resource-constrained front-end systems. Experiments on the MNIST and CIFAR10 datasets demonstrate the feasibility of the proposed approach as well as the existence of task-specified redundancy.
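The abstract does not spell out the distillation objective, but the standard knowledge distillation loss (Hinton et al.) that such approaches build on can be sketched as follows. This is an illustrative sketch only, not the paper's exact task-specified algorithm; all function names, the temperature `T`, and the mixing weight `alpha` are assumptions for the example.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft (teacher-matching) term and a hard (label) term.

    Hyperparameters here are illustrative, not taken from the paper.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Cross-entropy between softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = -np.mean(np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1)) * T * T
    # Standard cross-entropy against the ground-truth labels.
    p_hard = softmax(student_logits, 1.0)
    hard = -np.mean(np.log(p_hard[np.arange(len(labels)), labels] + 1e-12))
    return alpha * soft + (1.0 - alpha) * hard

# Example: random logits for a batch of 8 samples over 10 classes.
rng = np.random.default_rng(0)
student = rng.normal(size=(8, 10))
teacher = rng.normal(size=(8, 10))
labels = rng.integers(0, 10, size=8)
loss = distillation_loss(student, teacher, labels)
```

A simplified student trained with this objective mimics the teacher's soft outputs on the task-relevant classes while still fitting the ground-truth labels, which is the mechanism a task-specified variant would restrict to the limited target range of interest.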
