
HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search

2021-02-23

Niv Nayman, Yonathan Aflalo, Asaf Noy, Lihi Zelnik-Manor


Abstract

Realistic use of neural networks often requires adhering to multiple constraints on latency, energy, and memory, among others. A popular approach to finding fitting networks is constrained Neural Architecture Search (NAS); however, previous methods enforce the constraint only softly. As a result, the produced networks do not exactly adhere to the resource constraint, and their accuracy suffers. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), which is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
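To make the "hard constraint throughout the search" idea concrete, here is a minimal, hypothetical sketch (not the paper's implementation): when each candidate operation is selected with some probability, the expected latency is linear in those probabilities, so a latency budget defines a convex feasible set that every search iterate can be kept inside. The `project_to_budget` feasibility step below is an illustrative assumption, not the paper's actual search procedure.

```python
# Sketch: expected latency lat(alpha) = sum_i alpha_i * t_i is linear in the
# operation probabilities alpha (a point on the probability simplex), so a
# hard budget lat(alpha) <= T carves out a convex feasible set.

def expected_latency(alpha, t):
    """Expected latency under operation probabilities alpha and per-op latencies t."""
    return sum(a * ti for a, ti in zip(alpha, t))

def project_to_budget(alpha, t, T):
    """Hypothetical feasibility step: if the budget T is exceeded, mix alpha
    with the one-hot vector of the cheapest operation. Mixing two simplex
    points stays on the simplex; the mixing weight is chosen so the budget
    holds with equality."""
    lat = expected_latency(alpha, t)
    if lat <= T:
        return alpha
    cheapest = min(range(len(t)), key=lambda i: t[i])
    theta = (lat - T) / (lat - t[cheapest])
    return [(1 - theta) * a + theta * (1.0 if i == cheapest else 0.0)
            for i, a in enumerate(alpha)]

# Example: three candidate ops with latencies 1, 3 and 5 ms.
t = [1.0, 3.0, 5.0]
alpha = [0.2, 0.3, 0.5]                 # expected latency = 3.6 ms
alpha = project_to_budget(alpha, t, T=3.0)
print(round(expected_latency(alpha, t), 6))  # -> 3.0, budget now tight
```

Because the constraint is enforced at every step rather than penalized in the loss, the final architecture satisfies the budget exactly, with no post-hoc tuning of a penalty coefficient.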


Benchmark Results

Dataset  | Model            | Metric               | Claimed | Verified | Status
ImageNet | HardcoreNAS_E_KD | Top-1 Error Rate (%) | 19.9    |          | Unverified
ImageNet | HardcoreNAS_D_KD | Top-1 Error Rate (%) | 20.5    |          | Unverified
ImageNet | HardcoreNAS_C_KD | Top-1 Error Rate (%) | 21.1    |          | Unverified
ImageNet | HardcoreNAS_B_KD | Top-1 Error Rate (%) | 21.2    |          | Unverified
ImageNet | HardcoreNAS_A_KD | Top-1 Error Rate (%) | 21.7    |          | Unverified
ImageNet | HardcoreNAS_F    | Top-1 Error Rate (%) | 21.9    |          | Unverified
ImageNet | HardcoreNAS_E    | Top-1 Error Rate (%) | 22.1    |          | Unverified
ImageNet | HardcoreNAS_D    | Top-1 Error Rate (%) | 22.6    |          | Unverified
ImageNet | HardcoreNAS_C    | Top-1 Error Rate (%) | 22.9    |          | Unverified
ImageNet | HardcoreNAS_B    | Top-1 Error Rate (%) | 23.5    |          | Unverified
ImageNet | HardcoreNAS_A    | Top-1 Error Rate (%) | 24.1    |          | Unverified
