SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS replaces the manual process of an expert iteratively tweaking a network and observing what works: it searches automatically over a space of candidate architectures, often discovering designs more complex than hand-crafted ones.
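The simplest instance of this loop is random search: repeatedly sample an architecture from a search space, evaluate it, and keep the best. The sketch below illustrates the idea; the search space, the `proxy_score` stand-in for "train and measure validation accuracy", and all names are illustrative assumptions, not taken from any paper listed on this page.

```python
import random

# Hypothetical search space: each architecture is a choice of depth,
# width, and cell operation. Real NAS spaces (e.g. NASNet, DARTS) are
# vastly larger, and evaluating a candidate means actually training it.
SEARCH_SPACE = {
    "depth": [4, 8, 12, 20],
    "width": [16, 32, 64, 128],
    "op": ["conv3x3", "conv5x5", "sep_conv", "max_pool"],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Illustrative stand-in for post-training validation accuracy
    (assumption: a real system would train `arch` and measure it)."""
    score = 0.5 + 0.01 * SEARCH_SPACE["depth"].index(arch["depth"])
    score += 0.005 * SEARCH_SPACE["width"].index(arch["width"])
    return score

def random_search(n_trials=50, seed=0):
    """Sample, evaluate, keep the best -- the baseline NAS strategy."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best_arch, best_score = random_search()
print(best_arch, round(best_score, 3))
```

More sophisticated NAS methods replace the uniform sampler with a learned controller (reinforcement learning, as in the credited figure), an evolutionary algorithm, or a differentiable relaxation of the search space, and replace full training with cheaper proxy evaluations.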

[Figure: NAS overview diagram. Image credit: NAS with Reinforcement Learning]

Papers

Showing 451–475 of 1915 papers

Title | Status | Hype
BINAS: Bilinear Interpretable Neural Architecture Search | Code | 0
DartsReNet: Exploring new RNN cells in ReNet architectures | Code | 0
ReFusion: Improving Natural Language Understanding with Computation-Efficient Retrieval Representation Fusion | Code | 0
Automated Fusion of Multimodal Electronic Health Records for Better Medical Predictions | Code | 0
Automated Dominative Subspace Mining for Efficient Neural Architecture Search | Code | 0
Deep Active Learning with a Neural Architecture Search | Code | 0
Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis | Code | 0
Deep Bayesian Structure Networks | Code | 0
Improving Neural Architecture Search by Mixing a FireFly algorithm with a Training Free Evaluation | Code | 0
On Adversarial Robustness: A Neural Architecture Search perspective | Code | 0
Improving Neural Architecture Search Image Classifiers via Ensemble Learning | Code | 0
Deeper Insights into Weight Sharing in Neural Architecture Search | Code | 0
Improved Differentiable Architecture Search for Language Modeling and Named Entity Recognition | Code | 0
Learnable Extended Activation Function (LEAF) for Deep Neural Networks | Code | 0
Adaptive Search-and-Training for Robust and Efficient Network Pruning | Code | 0
Implantable Adaptive Cells: differentiable architecture search to improve the performance of any trained U-shaped network | Code | 0
Improve Ranking Correlation of Super-net through Training Scheme from One-shot NAS to Few-shot NAS | Code | 0
Improving Neural Networks for Time Series Forecasting using Data Augmentation and AutoML | Code | 0
Deep Neural Architecture Search with Deep Graph Bayesian Optimization | Code | 0
Learning Implicitly Recurrent CNNs Through Parameter Sharing | Code | 0
AutoLC: Search Lightweight and Top-Performing Architecture for Remote Sensing Image Land-Cover Classification | Code | 0
DAIS: Automatic Channel Pruning via Differentiable Annealing Indicator Search | Code | 0
Model Input-Output Configuration Search with Embedded Feature Selection for Sensor Time-series and Image Classification | Code | 0
CycleGANAS: Differentiable Neural Architecture Search for CycleGAN | Code | 0
Customized Subgraph Selection and Encoding for Drug-drug Interaction Prediction | Code | 0
Page 19 of 77

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | – | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | – | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | – | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | – | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | – | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | – | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | – | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | – | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | – | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | – | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | – | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | – | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | – | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | – | Unverified
6 | NAR | Accuracy (Test) | 46.66 | – | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | – | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | – | Unverified
9 | Local search | Accuracy (Test) | 46.38 | – | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | – | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | – | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | – | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | – | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | – | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | – | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | – | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | – | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | – | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | – | Unverified