Improving Ranking Correlation of Supernet with Candidates Enhancement and Progressive Training
Ziwei Yang, Ruyi Zhang, Zhi Yang, Xubo Yang, Lei Wang, Zheyang Li
Code (official, PyTorch): github.com/tend93/cvpr2021_nas_competition_track1_1st_solution
Abstract
One-shot neural architecture search (NAS) applies a weight-sharing supernet to reduce the unaffordable computational overhead of automated architecture design. However, weight sharing worsens the ranking consistency of candidate performance because of interference between different candidate networks. To address this issue, we propose a candidates-enhancement method and a progressive training pipeline to improve the ranking correlation of the supernet. Specifically, we carefully redesign the sub-networks in the supernet and map the original supernet to a new one of higher capacity. In addition, we gradually add narrow branches to the supernet to reduce the degree of weight sharing, which effectively alleviates mutual interference between sub-networks. Finally, our method won 1st place in the Supernet Track of the CVPR 2021 1st Lightweight NAS Challenge.
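To make the interference problem concrete, the sketch below shows the core weight-sharing mechanism in a one-shot supernet: every candidate width slices the same weight tensor, so gradient updates from different sub-networks overwrite overlapping parameters. This is a minimal, hypothetical illustration of weight sharing in general, not the authors' implementation; the class name `SharedLinear` and the sampled widths are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedLinear:
    """A single weight matrix shared by all candidate widths.

    A sub-network of output width k uses only the first k rows, so
    candidates are nested slices of one tensor rather than independent
    networks -- the source of the mutual interference that degrades
    ranking consistency. (Hypothetical sketch, not the paper's code.)
    """

    def __init__(self, max_in, max_out):
        self.weight = rng.normal(scale=0.01, size=(max_out, max_in))

    def forward(self, x, out_width):
        # Slice the shared tensor for the sampled candidate width.
        w = self.weight[:out_width, : x.shape[-1]]
        return x @ w.T

layer = SharedLinear(max_in=16, max_out=32)
x = rng.normal(size=(4, 16))
for width in (8, 16, 32):  # sampled candidate widths
    y = layer.forward(x, width)
    print(y.shape)  # each candidate reuses the same underlying weights
```

Reducing the degree of sharing, as the proposed progressive pipeline does by adding narrow branches, amounts to giving such overlapping slices their own separately trained copies so that updates from one candidate no longer perturb another.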