
Dual-Tuning: Joint Prototype Transfer and Structure Regularization for Compatible Feature Learning

2021-08-06

Yan Bai, Jile Jiao, Shengsen Wu, Yihang Lou, Jun Liu, Xuetao Feng, Ling-Yu Duan


Abstract

Visual retrieval systems face frequent model updates and redeployments, and re-extracting features for the entire database at every update is a heavy workload. Feature compatibility enables newly learned visual features to be compared directly with the old features stored in the database, so that the inflexible and time-consuming re-extraction step can be bypassed when the deployed model is updated. However, the old feature space that must be made compatible is not ideal, and it suffers from a distribution discrepancy with the new space caused by different supervision losses. In this work, we propose Dual-Tuning, a globally optimized method that achieves feature compatibility across different networks and losses. A feature-level prototype loss is proposed to explicitly align the two types of embedding features by transferring global prototype information. Furthermore, we design a component-level mutual structural regularization to implicitly optimize the intrinsic structure of the features. Experimental results on million-scale datasets demonstrate that Dual-Tuning obtains feature compatibility without sacrificing performance. (Our code will be available at https://github.com/yanbai1993/Dual-Tuning)
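To make the prototype-transfer idea concrete, here is a minimal NumPy sketch of one plausible form of a feature-level prototype alignment loss. The abstract does not give the exact formula, so the function names (`class_prototypes`, `prototype_alignment_loss`) and the use of a squared-distance penalty are illustrative assumptions, not the paper's actual objective: old-model class means serve as prototypes, and new-model features are pulled toward the prototype of their class so the two embedding spaces remain directly comparable.

```python
import numpy as np

def class_prototypes(old_features, labels):
    """Compute one prototype per class: the mean of the old-model
    features belonging to that class. (Assumed definition.)"""
    return {c: old_features[labels == c].mean(axis=0)
            for c in np.unique(labels)}

def prototype_alignment_loss(new_features, labels, prototypes):
    """Mean squared distance from each new-model feature to the old-model
    prototype of its class. Minimizing this pulls the new embedding space
    toward the old one, so new queries can be matched against features
    already stored in the database. (Illustrative loss, not the paper's.)"""
    targets = np.stack([prototypes[c] for c in labels])
    return float(np.mean(np.sum((new_features - targets) ** 2, axis=1)))

# Tiny usage example with two classes in a 2-D feature space.
old = np.array([[1.0, 0.0], [0.8, 0.2], [0.0, 1.0], [0.2, 0.8]])
lbl = np.array([0, 0, 1, 1])
protos = class_prototypes(old, lbl)
new = np.array([[0.9, 0.1], [0.7, 0.3], [0.1, 0.9], [0.3, 0.7]])
loss = prototype_alignment_loss(new, lbl, protos)
```

In a real training loop this term would be added to the new model's supervision loss, whereas the structural regularization described above would act on relations between feature components rather than on the features themselves.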
