SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS automates what a human engineer would otherwise do by hand: iteratively tweaking a network, observing what works, and using that feedback to discover better, and often more complex, architectures.
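The loop described above can be sketched as a minimal random-search NAS: sample candidate architectures from a search space, score each one, and keep the best. Everything below is illustrative and hypothetical, including the search space and the `evaluate` surrogate; a real system would train each candidate and return its validation accuracy, and most NAS methods in the papers listed here replace random sampling with reinforcement learning, evolution, or gradient-based search.

```python
import random

# Hypothetical search space: each architecture is a choice of depth,
# width, and convolution kernel size.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "kernel": [3, 5, 7],
}

def sample_architecture(rng):
    """Sample one candidate uniformly from the search space."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for the expensive train-and-validate step.

    This made-up score rewards deeper and wider networks with a small
    penalty for large kernels; a real NAS system would train `arch`
    and return its validation accuracy instead.
    """
    return arch["depth"] * 0.05 + arch["width"] * 0.01 - arch["kernel"] * 0.02

def random_search(num_trials=20, seed=0):
    """Evaluate `num_trials` random candidates and keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 2))
```

The key design choice all NAS methods share is the split between a sampling strategy (here, uniform random) and an evaluation signal (here, a toy surrogate); the papers below mostly differ in how they make one or both of these cheaper or smarter.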

Image Credit: NAS with Reinforcement Learning

Papers

Showing 151–200 of 1915 papers

Title | Status | Hype
Automated Concatenation of Embeddings for Structured Prediction | Code | 1
Automated Machine Learning on Graphs: A Survey | Code | 1
DATA: Domain-Aware and Task-Aware Self-supervised Learning | Code | 1
Dataset Condensation with Distribution Matching | Code | 1
CAKES: Channel-wise Automatic KErnel Shrinking for Efficient 3D Networks | Code | 1
A Study on Encodings for Neural Architecture Search | Code | 1
ChamNet: Towards Efficient Network Design through Platform-Aware Model Adaptation | Code | 1
Deep Multimodal Neural Architecture Search | Code | 1
BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search | Code | 1
DHP: Differentiable Meta Pruning via HyperNetworks | Code | 1
Differentiable Neural Architecture Learning for Efficient Neural Network Design | Code | 1
Differential Evolution for Neural Architecture Search | Code | 1
BN-NAS: Neural Architecture Search with Batch Normalization | Code | 1
Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? | Code | 1
DSNAS: Direct Neural Architecture Search without Parameter Retraining | Code | 1
DU-DARTS: Decreasing the Uncertainty of Differentiable Architecture Search | Code | 1
Bounce: Reliable High-Dimensional Bayesian Optimization for Combinatorial and Mixed Spaces | Code | 1
Blockwisely Supervised Neural Architecture Search with Knowledge Distillation | Code | 1
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation | Code | 1
Efficient Forward Architecture Search | Code | 1
BM-NAS: Bilevel Multimodal Neural Architecture Search | Code | 1
Automated Search for Resource-Efficient Branched Multi-Task Networks | Code | 1
Efficient Neural Architecture Search for End-to-end Speech Recognition via Straight-Through Gradients | Code | 1
Efficient Neural Architecture Search via Parameter Sharing | Code | 1
AttentiveNAS: Improving Neural Architecture Search via Attentive Sampling | Code | 1
Aligned Structured Sparsity Learning for Efficient Image Super-Resolution | Code | 1
Accelerating Evolutionary Neural Architecture Search via Multi-Fidelity Evaluation | Code | 1
ElasticViT: Conflict-aware Supernet Training for Deploying Fast Vision Transformer on Diverse Mobile Devices | Code | 1
EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search | Code | 1
Enhancing Neural Architecture Search with Multiple Hardware Constraints for Deep Learning Model Deployment on Tiny IoT Devices | Code | 1
Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search | Code | 1
Equivalence in Deep Neural Networks via Conjugate Matrix Ensembles | Code | 1
Neural Architecture Search using Deep Neural Networks and Monte Carlo Tree Search | Code | 1
Evolutionary Neural Cascade Search across Supernetworks | Code | 1
EvoPose2D: Pushing the Boundaries of 2D Human Pose Estimation using Accelerated Neuroevolution with Weight Transfer | Code | 1
Exploring Relational Context for Multi-Task Dense Prediction | Code | 1
Adaptive Cross-Layer Attention for Image Restoration | Code | 1
AutoML: A Survey of the State-of-the-Art | Code | 1
CLEARER: Multi-Scale Neural Architecture Search for Image Restoration | Code | 1
AutoML4ETC: Automated Neural Architecture Search for Real-World Encrypted Traffic Classification | Code | 1
β-DARTS++: Bi-level Regularization for Proxy-robust Differentiable Architecture Search | Code | 1
AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning | Code | 1
FlowNAS: Neural Architecture Search for Optical Flow Estimation | Code | 1
FNA++: Fast Network Adaptation via Parameter Remapping and Architecture Search | Code | 1
AdvRush: Searching for Adversarially Robust Neural Architectures | Code | 1
Adjoined Networks: A Training Paradigm with Applications to Network Compression | Code | 1
β-DARTS: Beta-Decay Regularization for Differentiable Architecture Search | Code | 1
AutoGL: A Library for Automated Graph Learning | Code | 1
AOWS: Adaptive and optimal network width search with latency constraints | Code | 1
b-DARTS: Beta-Decay Regularization for Differentiable Architecture Search | Code | 1
Page 4 of 39

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | | Unverified
6 | NAR | Accuracy (Test) | 46.66 | | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | | Unverified
9 | Local search | Accuracy (Test) | 46.38 | | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | | Unverified