
Tiny Updater: Towards Efficient Neural Network-Driven Software Updating

2023-01-01 · ICCV 2023

Linfeng Zhang, Kaisheng Ma


Abstract

Deep neural networks have achieved significant advances in diverse visual tasks, which has substantially increased their deployment in edge-device software. However, when neural network-based software is updated, users must download all the parameters of the new network anew, which harms the user experience. Motivated by prior progress in model compression, we propose a novel training methodology named Tiny Updater to address this issue. Specifically, by adopting variants of pruning and knowledge distillation, Tiny Updater can update neural network-based software by downloading only a few parameters (10%-20%) instead of all the parameters of the network. Experiments on eleven datasets across three tasks, including image classification, image-to-image translation, and video recognition, have demonstrated its effectiveness. Code has been released at https://github.com/ArchipLab-LinfengZhang/TinyUpdater.
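The core idea of updating software by shipping only a small fraction of the parameters can be illustrated with a minimal sketch. This is not the authors' implementation: the patch format, the `budget` parameter, and the magnitude-based selection are all illustrative assumptions; the paper's actual method relies on pruning and knowledge distillation during training.

```python
import numpy as np

def make_update_patch(old_params, new_params, budget=0.15):
    """Server side (hypothetical): select the fraction of weights with the
    largest change and pack them as a sparse patch {name: (indices, values)}.
    budget=0.15 mirrors the 10%-20% download ratio reported in the abstract."""
    patch = {}
    for name, old in old_params.items():
        diff = np.abs(new_params[name] - old).ravel()
        k = max(1, int(budget * diff.size))
        idx = np.argsort(diff)[-k:]                  # most-changed entries
        patch[name] = (idx, new_params[name].ravel()[idx])
    return patch

def apply_update_patch(params, patch):
    """Client side (hypothetical): overwrite only the downloaded entries."""
    for name, (idx, values) in patch.items():
        flat = params[name].ravel()                  # view into the array
        flat[idx] = values
    return params

rng = np.random.default_rng(0)
old = {"conv1": rng.standard_normal((8, 8))}
new = {"conv1": old["conv1"].copy()}
new["conv1"][:2] += 1.0                              # simulate fine-tuning drift

patch = make_update_patch(old, new, budget=0.15)
updated = apply_update_patch({k: v.copy() for k, v in old.items()}, patch)

sent = sum(v.size for _, v in patch.values())
total = sum(p.size for p in new.values())
print(f"downloaded {sent}/{total} parameters")
```

The sketch only captures the bandwidth saving, not how Tiny Updater makes such a small patch sufficient; in the paper, the training procedure itself (pruning plus knowledge distillation) constrains where the new version's parameters may differ from the deployed one.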
