SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), widely used models in the field of machine learning. NAS replaces the process of a human manually tweaking a neural network and learning what works well with an automated search, which can discover architectures that are more complex than hand-designed ones.
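The idea above can be sketched in a few lines. The following is a minimal, hypothetical example of the simplest NAS strategy, random search: sample an architecture from a search space, score it, and keep the best. The search space and the `evaluate` function are toy stand-ins (a real system would train each candidate network and measure validation accuracy).

```python
import random

# Hypothetical, tiny search space: an architecture is a choice of
# depth, width, and operation type. Real NAS spaces are vastly larger.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "op": ["conv3x3", "conv5x5", "sep_conv"],
}

def sample_architecture(rng):
    """Draw one random architecture from the search space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for 'train the network, measure validation accuracy'.
    A deterministic toy score so the example runs instantly."""
    return arch["depth"] * 0.1 + arch["width"] * 0.01 - (arch["op"] == "conv5x5") * 0.05

def random_search(n_trials=20, seed=0):
    """Simplest NAS loop: sample, evaluate, keep the best so far."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best_arch, best_score = random_search()
```

Methods in the listing below differ mainly in how they replace this loop: reinforcement learning or evolution chooses what to sample next, while differentiable methods (the DARTS family) relax the discrete choice and search by gradient descent.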

Image credit: NAS with Reinforcement Learning

Papers

Showing 1551–1600 of 1915 papers

| Title | Status | Hype |
| --- | --- | --- |
| IS-DARTS: Stabilizing DARTS through Precise Measurement on Candidate Importance | Code | 0 |
| GRAN is superior to GraphRNN: node orderings, kernel- and graph embeddings-based metrics for graph generators | Code | 0 |
| GradSign: Model Performance Inference with Theoretical Insights | Code | 0 |
| ISyNet: Convolutional Neural Networks design for AI accelerator | Code | 0 |
| Automating Data Science Pipelines with Tensor Completion | Code | 0 |
| CONet: Channel Optimization for Convolutional Neural Networks | Code | 0 |
| A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search | Code | 0 |
| A Classification of G-invariant Shallow Neural Networks | Code | 0 |
| Gibbs randomness-compression proposition: An efficient deep learning | Code | 0 |
| Graph-based Neural Architecture Search with Operation Embeddings | Code | 0 |
| Operation-level Progressive Differentiable Architecture Search | Code | 0 |
| GENNAPE: Towards Generalized Neural Architecture Performance Estimators | Code | 0 |
| Genetic Network Architecture Search | Code | 0 |
| Knowledge-aware Evolutionary Graph Neural Architecture Search | Code | 0 |
| Combinatorial Bayesian Optimization using the Graph Cartesian Product | Code | 0 |
| Optimal Transport Kernels for Sequential and Parallel Neural Architecture Search | Code | 0 |
| Landmark Regularization: Ranking Guided Super-Net Training in Neural Architecture Search | Code | 0 |
| AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment | Code | 0 |
| Large Language Model Assisted Adversarial Robustness Neural Architecture Search | Code | 0 |
| Large-Scale Evolution of Image Classifiers | Code | 0 |
| Searching for TrioNet: Combining Convolution with Local and Global Self-Attention | Code | 0 |
| Latency-Aware Differentiable Neural Architecture Search | Code | 0 |
| A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets | Code | 0 |
| Automated Dominative Subspace Mining for Efficient Neural Architecture Search | Code | 0 |
| Optimizing edge AI models on HPC systems with the edge in the loop | Code | 0 |
| Optimizing Neural Architecture Search using Limited GPU Time in a Dynamic Search Space: A Gene Expression Programming Approach | Code | 0 |
| Λ-DARTS: Mitigating Performance Collapse by Harmonizing Operation Selection among Cells | Code | 0 |
| Learnable Embedding Space for Efficient Neural Architecture Compression | Code | 0 |
| Learnable Extended Activation Function (LEAF) for Deep Neural Networks | Code | 0 |
| Learn Basic Skills and Reuse: Modularized Adaptive Neural Architecture Search (MANAS) | Code | 0 |
| Optimizing the Neural Architecture of Reinforcement Learning Agents | Code | 0 |
| Automatic Generation of Neural Architecture Search Spaces | Code | 0 |
| Language Models with Transformers | Code | 0 |
| Order-Preserving GFlowNets | Code | 0 |
| Automated Heterogeneous Network learning with Non-Recursive Message Passing | Code | 0 |
| OStr-DARTS: Differentiable Neural Architecture Search based on Operation Strength | Code | 0 |
| Colab NAS: Obtaining lightweight task-specific convolutional neural networks following Occam's razor | Code | 0 |
| Learning Deep Morphological Networks with Neural Architecture Search | Code | 0 |
| 3DLaneNAS: Neural Architecture Search for Accurate and Light-Weight 3D Lane Detection | Code | 0 |
| Generative Teaching Networks: Accelerating Neural Architecture Search by Learning to Generate Synthetic Training Data | Code | 0 |
| Generalized Latency Performance Estimation for Once-For-All Neural Architecture Search | Code | 0 |
| CHASE: Robust Visual Tracking via Cell-Level Differentiable Neural Architecture Search | Code | 0 |
| Learning from Mistakes -- A Framework for Neural Architecture Search | Code | 0 |
| Channel-wise Mixed-precision Assignment for DNN Inference on Constrained Edge Nodes | Code | 0 |
| Learning Graph Convolutional Network for Skeleton-based Human Action Recognition by Neural Searching | Code | 0 |
| On Adversarial Robustness: A Neural Architecture Search perspective | Code | 0 |
| Learning Implicitly Recurrent CNNs Through Parameter Sharing | Code | 0 |
| Learning Interpretable Models Through Multi-Objective Neural Architecture Search | Code | 0 |
| Parallel Hyperparameter Optimization Of Spiking Neural Network | Code | 0 |
| Unifying and Boosting Gradient-Based Training-Free Neural Architecture Search | Code | 0 |
Page 32 of 39

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | | Unverified |
| 2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | | Unverified |
| 3 | SPOS (block search + channel search) | Accuracy | 74.7 | | Unverified |
| 4 | MUXNet-xs | Top-1 Error Rate | 33.3 | | Unverified |
| 5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | | Unverified |
| 6 | LayerNAS-60M | Top-1 Error Rate | 31 | | Unverified |
| 7 | NASGEP | Top-1 Error Rate | 29.51 | | Unverified |
| 8 | MUXNet-s | Top-1 Error Rate | 28.4 | | Unverified |
| 9 | NN-MASS-A | Top-1 Error Rate | 27.1 | | Unverified |
| 10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CR-LSO | Accuracy (Test) | 46.98 | | Unverified |
| 2 | Shapley-NAS | Accuracy (Test) | 46.85 | | Unverified |
| 3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | | Unverified |
| 4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | | Unverified |
| 5 | ASE-NAS+ | Accuracy (Val) | 46.66 | | Unverified |
| 6 | NAR | Accuracy (Test) | 46.66 | | Unverified |
| 7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | | Unverified |
| 8 | AG-Net | Accuracy (Test) | 46.42 | | Unverified |
| 9 | Local search | Accuracy (Test) | 46.38 | | Unverified |
| 10 | NASBOT | Accuracy (Test) | 46.37 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Balanced Mixture | Accuracy (%) | 91.55 | | Unverified |
| 2 | GDAS | Top-1 Error Rate | 3.4 | | Unverified |
| 3 | Bonsai-Net | Top-1 Error Rate | 3.35 | | Unverified |
| 4 | Net2 (2) | Top-1 Error Rate | 3.3 | | Unverified |
| 5 | μDARTS | Top-1 Error Rate | 3.28 | | Unverified |
| 6 | NN-MASS- CIFAR-C | Top-1 Error Rate | 3.18 | | Unverified |
| 7 | NN-MASS- CIFAR-A | Top-1 Error Rate | 3 | | Unverified |
| 8 | DARTS (first order) | Top-1 Error Rate | 3 | | Unverified |
| 9 | NASGEP | Top-1 Error Rate | 2.82 | | Unverified |
| 10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | | Unverified |