SOTAVerified

LayerNAS: Neural Architecture Search in Polynomial Complexity

2023-04-23

Yicheng Fan, Dana Alon, Jingyue Shen, Daiyi Peng, Keshav Kumar, Yun Long, Xin Wang, Fotis Iliopoulos, Da-Cheng Juan, Erik Vee


Abstract

Neural Architecture Search (NAS) has become a popular method for discovering effective model architectures, especially for target hardware. As such, NAS methods that find optimal architectures under constraints are essential. In our paper, we propose LayerNAS to address the challenge of multi-objective NAS by transforming it into a combinatorial optimization problem, which effectively constrains the search complexity to be polynomial. For a model architecture with L layers, we perform layerwise-search for each layer, selecting from a set of search options S. LayerNAS groups model candidates based on one objective, such as model size or latency, and searches for the optimal model based on another objective, thereby splitting the cost and reward elements of the search. This approach limits the search complexity to O(H·|S|·L), where H is a constant set in LayerNAS. Our experiments show that LayerNAS is able to consistently discover superior models across a variety of search spaces in comparison to strong baselines, including search spaces derived from NATS-Bench, MobileNetV2 and MobileNetV3.
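The layerwise, bucketed search described in the abstract can be sketched as follows. This is a hypothetical simplification for illustration, not the authors' implementation: `cost_fn`, `reward_fn`, `H`, and `max_cost` are assumed inputs, costs are bucketed uniformly, and each bucket keeps only the best-reward partial architecture seen so far, which is what bounds the number of candidates per layer to H·|S| and the total work to O(H·|S|·L).

```python
def layernas_sketch(L, S, cost_fn, reward_fn, H, max_cost):
    """Hypothetical sketch of a LayerNAS-style layerwise bucketed search.

    L        -- number of layers to search
    S        -- shared set of per-layer search options
    cost_fn  -- cost of a (partial) architecture, e.g. model size or latency
    reward_fn-- quality of a (partial) architecture, e.g. accuracy proxy
    H        -- number of cost buckets (the constant from the abstract)
    max_cost -- hard constraint on total cost
    """
    # bucket index -> best-reward partial architecture with a cost in that range
    buckets = {0: []}
    for _ in range(L):
        next_buckets = {}
        for arch in buckets.values():
            for opt in S:
                cand = arch + [opt]
                cost = cost_fn(cand)
                if cost > max_cost:
                    continue  # violates the constraint objective
                # Map the cost to one of H uniform buckets.
                b = min(H - 1, int(H * cost / max_cost))
                # Keep only the best-reward candidate per bucket,
                # so at most H architectures survive each layer.
                if b not in next_buckets or reward_fn(cand) > reward_fn(next_buckets[b]):
                    next_buckets[b] = cand
        buckets = next_buckets
    # Best complete architecture across all cost buckets.
    return max(buckets.values(), key=reward_fn)


# Toy usage: cost is the sum of option costs, reward favors many mid-size layers.
best = layernas_sketch(
    L=3, S=[1, 2, 3],
    cost_fn=sum,
    reward_fn=lambda a: sum(x ** 0.5 for x in a),
    H=4, max_cost=6,
)
print(best)  # → [2, 2, 2]
```

Because only one candidate survives per bucket, the grouping is a lossy heuristic: two partial architectures in the same bucket are treated as interchangeable on the cost axis, and the search is exhaustive only along the reward axis within each bucket.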

Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
ImageNet | LayerNAS-300M | Top-1 Error Rate | 22.9 | | Unverified
ImageNet | LayerNAS-60M | Top-1 Error Rate | 31.0 | | Unverified
ImageNet | LayerNAS-220M | Top-1 Error Rate | 24.4 | | Unverified
ImageNet | LayerNAS-600M | Top-1 Error Rate | 21.4 | | Unverified
NAS-Bench-101 | LayerNAS | Accuracy (%) | 94.26 | | Unverified
NATS-Bench Size, CIFAR-10 | LayerNAS | Test Accuracy | 93.2 | | Unverified
NATS-Bench Size, CIFAR-100 | LayerNAS | Test Accuracy | 70.64 | | Unverified
NATS-Bench Size, ImageNet16-120 | LayerNAS | Test Accuracy | 45.37 | | Unverified

Reproductions