SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a class of models widely used in machine learning. NAS takes the process a human would otherwise perform manually, tweaking a network and learning what works well, and automates it in order to discover architectures that would be difficult to design by hand.
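At its simplest, NAS is a search loop: sample a candidate architecture from a defined search space, score it, and keep the best. The sketch below shows this loop with random search; the search space, knob names, and proxy scoring function are illustrative assumptions for the sketch, not any specific NAS system (a real run would train each candidate and score it by validation accuracy).

```python
import random

# Toy search space: each knob lists its allowed values (illustrative only).
SEARCH_SPACE = {
    "num_layers": [2, 4, 8],
    "width": [16, 32, 64],
    "activation": ["relu", "gelu"],
}

def sample_architecture(rng):
    """Draw one candidate uniformly at random from the search space."""
    return {knob: rng.choice(values) for knob, values in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for the expensive step: a real NAS run would train the
    candidate network and return its validation accuracy. Here we use a
    made-up proxy score so the sketch runs instantly."""
    return arch["num_layers"] * 0.05 + arch["width"] * 0.001

def random_search(trials=50, seed=0):
    """Evaluate `trials` random candidates and return the highest-scoring one."""
    rng = random.Random(seed)
    return max((sample_architecture(rng) for _ in range(trials)), key=evaluate)

best = random_search()
print(best)
```

Practical NAS methods differ from this sketch mainly in the search strategy (reinforcement learning, evolution, or a differentiable relaxation as in DARTS) and in how they cheapen the evaluation step, which otherwise dominates the cost.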

[Figure omitted.] Image credit: NAS with Reinforcement Learning

Papers

Showing 1276–1300 of 1915 papers

Title | Status | Hype
Towards a Robust Differentiable Architecture Search under Label Noise | - | 0
Towards Assessing the Impact of Bayesian Optimization's Own Hyperparameters | - | 0
Towards Automated Neural Interaction Discovery for Click-Through Rate Prediction | - | 0
Automated Search-Space Generation Neural Architecture Search | - | 0
Towards Bi-directional Skip Connections in Encoder-Decoder Architectures and Beyond | - | 0
Towards Cardiac Intervention Assistance: Hardware-aware Neural Architecture Exploration for Real-Time 3D Cardiac Cine MRI Segmentation | - | 0
Towards Improving the Consistency, Efficiency, and Flexibility of Differentiable Neural Architecture Search | - | 0
Towards Interpretable Physical-Conceptual Catchment-Scale Hydrological Modeling using the Mass-Conserving-Perceptron | - | 0
CiMNet: Towards Joint Optimization for DNN Architecture and Configuration for Compute-In-Memory Hardware | - | 0
Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification | - | 0
Towards Neural Architecture Search for Transfer Learning in 6G Networks | - | 0
Towards One Shot Search Space Poisoning in Neural Architecture Search | - | 0
Towards Optimal Compression: Joint Pruning and Quantization | - | 0
Towards Oracle Knowledge Distillation with Neural Architecture Search | - | 0
Towards Privacy-Preserving Neural Architecture Search | - | 0
Towards Regression-Free Neural Networks for Diverse Compute Platforms | - | 0
Towards Robust Out-of-Distribution Generalization: Data Augmentation and Neural Architecture Search Approaches | - | 0
Towards Tailored Models on Private AIoT Devices: Federated Direct Neural Architecture Search | - | 0
TRACE: Tensorizing and Generalizing Supernets from Neural Architecture Search | - | 0
Training BatchNorm Only in Neural Architecture Search and Beyond | - | 0
Training-free Neural Architecture Search through Variance of Knowledge of Deep Network Weights | - | 0
Trainless Model Performance Estimation for Neural Architecture Search | - | 0
TrajectoryNAS: A Neural Architecture Search for Trajectory Prediction | - | 0
TransBO: Hyperparameter Optimization via Two-Phase Transfer Learning | - | 0
Transfer Learning based Search Space Design for Hyperparameter Tuning | - | 0
Page 52 of 77

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | - | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | - | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | - | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | - | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | - | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | - | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | - | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | - | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | - | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | - | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | - | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | - | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | - | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | - | Unverified
6 | NAR | Accuracy (Test) | 46.66 | - | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | - | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | - | Unverified
9 | Local search | Accuracy (Test) | 46.38 | - | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | - | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | - | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | - | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | - | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | - | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | - | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | - | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | - | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | - | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | - | Unverified