SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. NAS takes the process a human would follow when manually tweaking a network and learning what works well, and automates it to discover architectures, often more complex ones, without hand-tuning.

Image credit: NAS with Reinforcement Learning
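To make the idea concrete, here is a minimal sketch of the simplest form of NAS, random search over a discrete search space. The search space, the `evaluate` proxy, and all names below are illustrative assumptions, not from any paper on this page; a real NAS system would train each candidate network and score it by validation accuracy.

```python
import random

# Hypothetical toy search space: depth, width, and kernel size.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "kernel": [3, 5, 7],
}

def sample_architecture(rng):
    """Draw one architecture uniformly at random from the search space."""
    return {name: rng.choice(choices) for name, choices in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in scoring function. In real NAS this would train the
    candidate network and return its validation accuracy."""
    return arch["depth"] * 0.05 + arch["width"] * 0.005 - arch["kernel"] * 0.01

def random_search(n_trials=20, seed=0):
    """Sample n_trials architectures and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

The methods in the papers below replace this brute-force loop with smarter search strategies (reinforcement learning, evolution, differentiable relaxations such as DARTS) and cheaper evaluation (weight sharing, training-free proxies), but the sample/evaluate/select skeleton is the same.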

Papers

Showing 301–350 of 1915 papers

Title | Status | Hype
Equivalence in Deep Neural Networks via Conjugate Matrix Ensembles | Code | 1
Interpretable Neural Architecture Search via Bayesian Optimisation with Weisfeiler-Lehman Kernels | Code | 1
Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? | Code | 1
NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing | Code | 1
Few-shot Neural Architecture Search | Code | 1
Dataset Condensation with Gradient Matching | Code | 1
Adjoined Networks: A Training Paradigm with Applications to Network Compression | Code | 1
Neural Architecture Search without Training | Code | 1
Overcoming Multi-Model Forgetting in One-Shot NAS With Diversity Maximization | Code | 1
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation | Code | 1
HAT: Hardware-Aware Transformers for Efficient Natural Language Processing | Code | 1
Synthetic Petri Dish: A Novel Surrogate Model for Rapid Architecture Search | Code | 1
AOWS: Adaptive and optimal network width search with latency constraints | Code | 1
Rethinking Performance Estimation in Neural Architecture Search | Code | 1
Neural Architecture Search for Gliomas Segmentation on Multimodal Magnetic Resonance Imaging | Code | 1
Neural Architecture Transfer | Code | 1
Noisy Differentiable Architecture Search | Code | 1
AutoSpeech: Neural Architecture Search for Speaker Recognition | Code | 1
Exploring the Loss Landscape in Neural Architecture Search | Code | 1
Towards Fast Adaptation of Neural Architectures with Meta Learning | Code | 1
Teaching Cameras to Feel: Estimating Tactile Physical Properties of Surfaces From Images | Code | 1
Angle-based Search Space Shrinking for Neural Architecture Search | Code | 1
Deep Multimodal Neural Architecture Search | Code | 1
Local Search is a Remarkably Strong Baseline for Neural Architecture Search | Code | 1
Towards Non-I.I.D. and Invisible Data with FedNAS: Federated Deep Learning via Neural Architecture Search | Code | 1
Geometry-Aware Gradient Algorithms for Neural Architecture Search | Code | 1
FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions | Code | 1
Neural Architecture Search for Lightweight Non-Local Networks | Code | 1
Neural Architecture Generator Optimization | Code | 1
MUXConv: Information Multiplexing in Convolutional Neural Networks | Code | 1
MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning | Code | 1
DHP: Differentiable Meta Pruning via HyperNetworks | Code | 1
CAKES: Channel-wise Automatic KErnel Shrinking for Efficient 3D Networks | Code | 1
NPENAS: Neural Predictor Guided Evolution for Neural Architecture Search | Code | 1
MiLeNAS: Efficient Neural Architecture Search via Mixed-Level Reformulation | Code | 1
Hit-Detector: Hierarchical Trinity Architecture Search for Object Detection | Code | 1
Are Labels Necessary for Neural Architecture Search? | Code | 1
BigNAS: Scaling Up Neural Architecture Search with Big Single-Stage Models | Code | 1
AutoSTR: Efficient Backbone Search for Scene Text Recognition | Code | 1
Hierarchical Neural Architecture Search for Single Image Super-Resolution | Code | 1
Searching Central Difference Convolutional Networks for Face Anti-Spoofing | Code | 1
Searching for Winograd-aware Quantized Networks | Code | 1
Semi-Supervised Neural Architecture Search | Code | 1
Neural Architecture Search for Compressed Sensing Magnetic Resonance Image Reconstruction | Code | 1
DSNAS: Direct Neural Architecture Search without Parameter Retraining | Code | 1
Knapsack Pruning with Inner Distillation | Code | 1
Training Large Neural Networks with Constant Memory using a New Execution Algorithm | Code | 1
Stabilizing Differentiable Architecture Search via Perturbation-based Regularization | Code | 1
Bayesian Neural Architecture Search using A Training-Free Performance Metric | Code | 1
NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search | Code | 1
Page 7 of 39

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | - | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | - | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | - | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | - | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | - | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | - | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | - | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | - | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | - | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | - | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | - | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | - | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | - | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | - | Unverified
6 | NAR | Accuracy (Test) | 46.66 | - | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | - | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | - | Unverified
9 | Local search | Accuracy (Test) | 46.38 | - | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | - | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | - | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | - | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | - | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | - | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | - | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | - | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | - | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | - | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | - | Unverified