SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS automates the trial-and-error process by which a human designer manually tweaks a network and learns what works well, allowing it to discover architectures that can be more complex and better-performing than hand-designed ones.
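To make the idea concrete, here is a minimal sketch of the simplest NAS strategy, random search over a discrete search space. The search space, the `proxy_score` function, and all parameter names are illustrative stand-ins invented for this example (a real NAS run would train and validate each candidate instead of computing a toy score):

```python
import random

# Hypothetical search space: each architecture is a dict of discrete
# choices. This space is an illustrative stand-in, not a real benchmark.
SEARCH_SPACE = {
    "layers": [2, 4, 8],
    "width": [32, 64, 128],
    "activation": ["relu", "tanh", "gelu"],
}

def proxy_score(arch):
    """Toy stand-in for 'train the candidate and measure validation
    accuracy'. Deterministic so the example is reproducible."""
    bonus = {"relu": 0.02, "tanh": 0.0, "gelu": 0.03}[arch["activation"]]
    return 0.7 + 0.01 * arch["layers"] + 0.0005 * arch["width"] + bonus

def random_search(n_trials=20, seed=0):
    """Sample architectures at random and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = {name: rng.choice(opts) for name, opts in SEARCH_SPACE.items()}
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 4))
```

More sophisticated NAS methods (reinforcement learning, evolution, differentiable search such as DARTS) replace the random sampling step with a learned or gradient-based proposal mechanism, but the loop structure — propose, evaluate, keep the best — is the same.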

Image Credit: NAS with Reinforcement Learning

Papers

Showing 726–750 of 1915 papers

| Title | Status | Hype |
|---|---|---|
| Fast and Practical Neural Architecture Search | Code | 0 |
| Dynamic Ensemble of Low-fidelity Experts: Mitigating NAS "Cold-Start" | Code | 0 |
| Architecture-Aware Minimization (A^2M): How to Find Flat Minima in Neural Architecture Search | Code | 0 |
| CHASE: Robust Visual Tracking via Cell-Level Differentiable Neural Architecture Search | Code | 0 |
| Hierarchical Representations for Efficient Architecture Search | Code | 0 |
| Band-gap regression with architecture-optimized message-passing neural networks | Code | 0 |
| DVOLVER: Efficient Pareto-Optimal Neural Network Architecture Search | Code | 0 |
| D-VAE: A Variational Autoencoder for Directed Acyclic Graphs | Code | 0 |
| Neural Predictor for Neural Architecture Search | Code | 0 |
| DropNAS: Grouped Operation Dropout for Differentiable Architecture Search | Code | 0 |
| BAM: Bottleneck Attention Module | Code | 0 |
| Autoequivariant Network Search via Group Decomposition | Code | 0 |
| Fast Neural Network Adaptation via Parameter Remapping and Architecture Search | Code | 0 |
| Guided Evolution for Neural Architecture Search | Code | 0 |
| Hardware Aware Neural Network Architectures using FbNet | Code | 0 |
| DPNAS: Neural Architecture Search for Deep Learning with Differential Privacy | Code | 0 |
| FBNetV3: Joint Architecture-Recipe Search using Predictor Pretraining | Code | 0 |
| ABC-Di: Approximate Bayesian Computation for Discrete Data | Code | 0 |
| Do Not Train It: A Linear Neural Architecture Search of Graph Neural Networks | Code | 0 |
| GraphPAS: Parallel Architecture Search for Graph Neural Networks | Code | 0 |
| GreenMachine: Automatic Design of Zero-Cost Proxies for Energy-Efficient NAS | Code | 0 |
| GRAN is superior to GraphRNN: node orderings, kernel- and graph embeddings-based metrics for graph generators | Code | 0 |
| Balanced Mixture of SuperNets for Learning the CNN Pooling Architecture | Code | 0 |
| Arch-Graph: Acyclic Architecture Relation Predictor for Task-Transferable Neural Architecture Search | Code | 0 |
| GradSign: Model Performance Inference with Theoretical Insights | Code | 0 |
Page 30 of 77

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | | Unverified |
| 2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | | Unverified |
| 3 | SPOS (block search + channel search) | Accuracy | 74.7 | | Unverified |
| 4 | MUXNet-xs | Top-1 Error Rate | 33.3 | | Unverified |
| 5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | | Unverified |
| 6 | LayerNAS-60M | Top-1 Error Rate | 31 | | Unverified |
| 7 | NASGEP | Top-1 Error Rate | 29.51 | | Unverified |
| 8 | MUXNet-s | Top-1 Error Rate | 28.4 | | Unverified |
| 9 | NN-MASS-A | Top-1 Error Rate | 27.1 | | Unverified |
| 10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CR-LSO | Accuracy (Test) | 46.98 | | Unverified |
| 2 | Shapley-NAS | Accuracy (Test) | 46.85 | | Unverified |
| 3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | | Unverified |
| 4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | | Unverified |
| 5 | ASE-NAS+ | Accuracy (Val) | 46.66 | | Unverified |
| 6 | NAR | Accuracy (Test) | 46.66 | | Unverified |
| 7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | | Unverified |
| 8 | AG-Net | Accuracy (Test) | 46.42 | | Unverified |
| 9 | Local search | Accuracy (Test) | 46.38 | | Unverified |
| 10 | NASBOT | Accuracy (Test) | 46.37 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Balanced Mixture | Accuracy (%) | 91.55 | | Unverified |
| 2 | GDAS | Top-1 Error Rate | 3.4 | | Unverified |
| 3 | Bonsai-Net | Top-1 Error Rate | 3.35 | | Unverified |
| 4 | Net2 (2) | Top-1 Error Rate | 3.3 | | Unverified |
| 5 | μDARTS | Top-1 Error Rate | 3.28 | | Unverified |
| 6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | | Unverified |
| 7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | | Unverified |
| 8 | DARTS (first order) | Top-1 Error Rate | 3 | | Unverified |
| 9 | NASGEP | Top-1 Error Rate | 2.82 | | Unverified |
| 10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | | Unverified |