Progressive Neural Architecture Search
Chenxi Liu, Barret Zoph, Maxim Neumann, Jonathon Shlens, Wei Hua, Li-Jia Li, Li Fei-Fei, Alan Yuille, Jonathan Huang, Kevin Murphy
Code
- github.com/chenxi116/PNASNet.pytorch (official, in paper, PyTorch)
- github.com/chenxi116/PNASNet.TF (official, in paper, TensorFlow)
- github.com/DataXujing/PNASNet_pytorch (PyTorch)
- github.com/yashkant/PNAS-Binarized-Neural-Networks (TensorFlow)
- github.com/Cadene/pretrained-models.pytorch (PyTorch)
- github.com/Mind23-2/MindCode-5/tree/main/pnasnet (MindSpore)
- github.com/mindspore-ai/models/tree/master/research/cv/pnasnet (MindSpore)
- github.com/mindspore-ecosystem/mindcv/blob/main/mindcv/models/pnasnet.py (MindSpore)
- github.com/Mind23-2/MindCode-107 (MindSpore)
- github.com/titu1994/progressive-neural-architecture-search (TensorFlow)
Abstract
We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms. Our approach uses a sequential model-based optimization (SMBO) strategy, in which we search for structures in order of increasing complexity, while simultaneously learning a surrogate model to guide the search through structure space. Direct comparison under the same search space shows that our method is up to 5 times more efficient than the RL method of Zoph et al. (2018) in terms of number of models evaluated, and 8 times faster in terms of total compute. The structures we discover in this way achieve state-of-the-art classification accuracies on CIFAR-10 and ImageNet.
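The core loop the abstract describes can be sketched in a few lines: grow candidate structures one block at a time, rank the expanded set with a learned surrogate, and keep only the top-K for real evaluation. The sketch below is a toy illustration of that progressive SMBO beam search, not the paper's implementation; the operation set `OPS` and the heuristic `surrogate_score` are hypothetical stand-ins (in the paper the surrogate is a learned accuracy predictor trained on already-evaluated structures).

```python
# Toy sketch of progressive SMBO search (assumptions: OPS and
# surrogate_score are hypothetical stand-ins, not the paper's).

OPS = ["sep3x3", "sep5x5", "max3x3", "identity"]  # toy operation set

def surrogate_score(structure):
    # Stand-in for the learned accuracy predictor: a deterministic
    # toy heuristic that happens to prefer separable convolutions.
    return sum(op.startswith("sep") for op in structure) / (len(structure) or 1)

def progressive_search(max_blocks=3, beam_width=4):
    beam = [[]]  # start from the empty structure (lowest complexity)
    for _ in range(max_blocks):
        # Expand every structure in the beam by one extra block.
        candidates = [s + [op] for s in beam for op in OPS]
        # Rank by surrogate prediction; keep only the top-K, which is
        # what makes the search cheaper than evaluating all expansions.
        candidates.sort(key=surrogate_score, reverse=True)
        beam = candidates[:beam_width]
    return beam

print(progressive_search()[0])  # highest-scoring 3-block structure
```

In the actual method, each surviving structure is trained briefly on the target task, and those measured accuracies are used to retrain the surrogate before the next expansion step; the toy heuristic above skips that feedback loop for brevity.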
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| ImageNet | PNASNet-5 | Top 1 Accuracy | 82.9 | — | Unverified |