SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used model class in machine learning. NAS takes the process by which a human designer manually tweaks a neural network and learns what works well, and automates it to discover architectures that would be difficult to design by hand.

Image credit: NAS with Reinforcement Learning
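The automation described above can be sketched with the simplest NAS baseline, random search: repeatedly sample a candidate architecture from a search space, score it, and keep the best. This is a minimal illustrative sketch, not any specific method from the papers below; the search space, the `proxy_score` function (a stand-in for the expensive train-and-validate step), and all names are hypothetical.

```python
import random

# Hypothetical toy search space: each architecture is a choice of
# depth, width, and activation -- illustrative stand-ins for the
# design decisions a human would otherwise tune by hand.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "activation": ["relu", "tanh", "swish"],
}

def sample_architecture(rng):
    """Draw one candidate architecture uniformly from the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Stand-in for the expensive train-and-validate step; a real
    NAS system would train `arch` and return validation accuracy."""
    bonus = {"relu": 0.0, "tanh": 0.01, "swish": 0.02}[arch["activation"]]
    return 0.5 + 0.01 * arch["depth"] + 0.001 * arch["width"] + bonus

def random_search(n_trials=100, seed=0):
    """Random-search NAS baseline: sample, score, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

Real NAS methods replace the uniform sampler with a learned controller (reinforcement learning, evolution, or gradient-based relaxation, as in many papers listed below) and replace the proxy with actual training, often accelerated by weight sharing or performance prediction.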

Papers

Showing 1401–1450 of 1915 papers

Title | Status | Hype
VINNAS: Variational Inference-based Neural Network Architecture Search | | 0
Multi-Modality Information Fusion for Radiomics-based Neural Architecture Search | | 0
An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization | Code | 1
A Study on Encodings for Neural Architecture Search | Code | 1
Accuracy Prediction with Non-neural Model for Neural Architecture Search | Code | 1
Journey Towards Tiny Perceptual Super-Resolution | Code | 1
NASGEM: Neural Architecture Search via Graph Embedding Method | | 0
Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search | Code | 1
Discretization-Aware Architecture Search | Code | 1
GOLD-NAS: Gradual, One-Level, Differentiable | Code | 1
Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery | | 0
Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation | Code | 0
Rethinking Bottleneck Structure for Efficient Mobile Network Design | Code | 1
Self-supervised Neural Architecture Search | | 0
Surrogate-assisted Particle Swarm Optimisation for Evolving Variable-length Transferable Blocks for Image Classification | | 0
Learning Search Space Partition for Black-box Optimization using Monte Carlo Tree Search | Code | 1
Towards Automated Neural Interaction Discovery for Click-Through Rate Prediction | | 0
Semi-discrete optimization through semi-discrete optimal transport: a framework for neural architecture search | Code | 0
Traditional and accelerated gradient descent for neural architecture search | Code | 0
Neural Architecture Design for GPU-Efficient Networks | Code | 1
Auto-PyTorch Tabular: Multi-Fidelity MetaLearning for Efficient and Robust AutoDL | Code | 2
NASTransfer: Analyzing Architecture Transferability in Large Scale Neural Architecture Search | | 0
FNA++: Fast Network Adaptation via Parameter Remapping and Architecture Search | Code | 1
AutoOD: Automated Outlier Detection via Curiosity-guided Search and Self-imitation Learning | | 0
Neural Architecture Optimization with Graph VAE | | 0
DrNAS: Dirichlet Neural Architecture Search | Code | 1
Cyclic Differentiable Architecture Search | Code | 1
Fine-Grained Stochastic Architecture Search | Code | 2
Revealing the Invisible with Model and Data Shrinking for Composite-database Micro-expression Recognition | | 0
Differentially-private Federated Neural Architecture Search | Code | 0
Fine-Tuning DARTS for Image Classification | | 0
Differentiable Neural Architecture Transformation for Reproducible Architecture Improvement | | 0
Multi-fidelity Neural Architecture Search with Knowledge Distillation | Code | 0
Neural Ensemble Search for Uncertainty Estimation and Dataset Shift | Code | 1
Inner Ensemble Networks: Average Ensemble as an Effective Regularizer | Code | 0
Equivalence in Deep Neural Networks via Conjugate Matrix Ensembles | Code | 1
Optimal Transport Kernels for Sequential and Parallel Neural Architecture Search | Code | 0
Interpretable Neural Architecture Search via Bayesian Optimisation with Weisfeiler-Lehman Kernels | Code | 1
Bonsai-Net: One-Shot Neural Architecture Search via Differentiable Pruners | Code | 0
NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing | Code | 1
Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? | Code | 1
Few-shot Neural Architecture Search | Code | 1
Adjoined Networks: A Training Paradigm with Applications to Network Compression | Code | 1
AMEIR: Automatic Behavior Modeling, Interaction Exploration and MLP Investigation in the Recommender System | | 0
Dataset Condensation with Gradient Matching | Code | 1
Knowledge Distillation: A Survey | | 0
Speedy Performance Estimation for Neural Architecture Search | Code | 0
Neural Architecture Search without Training | Code | 1
Efficient Architecture Search for Continual Learning | | 0
Conditional Neural Architecture Search | | 0
Page 29 of 39

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | | Unverified
6 | NAR | Accuracy (Test) | 46.66 | | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | | Unverified
9 | Local search | Accuracy (Test) | 46.38 | | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | | Unverified