SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used class of models in machine learning. Instead of a human manually tweaking a network and learning what works well, NAS automates this trial-and-error process, making it possible to discover architectures that would be impractical to design by hand.
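The automated trial-and-error loop described above can be sketched in its simplest form as random search: repeatedly sample a candidate architecture from a search space, evaluate it, and keep the best one seen. The search space, the `evaluate` proxy, and all names below are illustrative assumptions, not any specific NAS system; a real implementation would train each candidate (or use a cheap performance estimator) instead of the toy score used here.

```python
import random

# Hypothetical search space: each architecture is a choice of depth,
# width, and activation function. Real NAS spaces are far larger.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "activation": ["relu", "tanh", "gelu"],
}

def sample_architecture(rng):
    """Sample one candidate uniformly from the search space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for the expensive train-and-validate step.

    A real NAS system would train `arch` and return validation
    accuracy (or use a training-free proxy metric); this deterministic
    toy score only exists so the loop is runnable end to end.
    """
    score = arch["depth"] * 0.05 + arch["width"] * 0.001
    if arch["activation"] == "gelu":
        score += 0.02
    return score

def random_search(num_trials=20, seed=0):
    """The simplest NAS strategy: sample, evaluate, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, round(score, 3))
```

More sophisticated NAS methods replace the random sampler with a learned controller (reinforcement learning), an evolutionary algorithm, or a differentiable relaxation of the search space (as in the DARTS line of work listed below), but the sample-evaluate-select skeleton stays the same.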

Image credit: NAS with Reinforcement Learning

Papers

Showing 1801–1850 of 1915 papers

Title | Status | Hype
A Genetic Programming Approach to Designing Convolutional Neural Network Architectures | Code | 0
Stabilizing DARTS with Amended Gradient Estimation on Architectural Parameters | Code | 0
A Transformer-based Neural Architecture Search Method | Code | 0
DPNAS: Neural Architecture Search for Deep Learning with Differential Privacy | Code | 0
TabNAS: Rejection Sampling for Neural Architecture Search on Tabular Datasets | Code | 0
Resource Constrained Neural Network Architecture Search: Will a Submodularity Assumption Help? | Code | 0
Do Not Train It: A Linear Neural Architecture Search of Graph Neural Networks | Code | 0
Resource-efficient DNNs for Keyword Spotting using Neural Architecture Search and Quantization | Code | 0
Seesaw-Net: Convolution Neural Network With Uneven Group Convolution | Code | 0
Hierarchical Neural Architecture Search via Operator Clustering | Code | 0
Distilled Pruning: Using Synthetic Data to Win the Lottery | Code | 0
DiNTS: Differentiable Neural Network Topology Search for 3D Medical Image Segmentation | Code | 0
DiffPrune: Neural Network Pruning with Deterministic Approximate Binary Gates and L_0 Regularization | Code | 0
Differentially-private Federated Neural Architecture Search | Code | 0
Rethink DARTS Search Space and Renovate a New Benchmark | Code | 0
NEAR: A Training-Free Pre-Estimator of Machine Learning Model Performance | Code | 0
BATS: Binary ArchitecTure Search | Code | 0
BatchQuant: Quantized-for-all Architecture Search with Robust Quantizer | Code | 0
BASQ: Branch-wise Activation-clipping Search Quantization for Sub-4-bit Neural Networks | Code | 0
Band-gap regression with architecture-optimized message-passing neural networks | Code | 0
Standing on the Shoulders of Giants: Hardware and Neural Architecture Co-Search with Hot Start | Code | 0
BAM: Bottleneck Attention Module | Code | 0
Network Pruning via Transformable Architecture Search | Code | 0
Efficient Neural Architecture Search via Proximal Iterations | Code | 0
Balanced Mixture of SuperNets for Learning the CNN Pooling Architecture | Code | 0
Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search | Code | 0
Stochastic Adaptive Neural Architecture Search for Keyword Spotting | Code | 0
Neural Architecture Codesign for Fast Physics Applications | Code | 0
Rethinking the Value of Network Pruning | Code | 0
Differentiable Neural Architecture Search in Equivalent Space with Exploration Enhancement | Code | 0
Differentiable NAS Framework and Application to Ads CTR Prediction | Code | 0
Neural Architecture Optimization | Code | 0
Adaptive hybrid activation function for deep neural networks | Code | 0
UNAS: Differentiable Architecture Search Meets Reinforcement Learning | Code | 0
Differentiable Channel Selection in Self-Attention For Person Re-Identification | Code | 0
Towards Learning of Filter-Level Heterogeneous Compression of Convolutional Neural Networks | Code | 0
DiCENet: Dimension-wise Convolutions for Efficient Networks | Code | 0
DFG-NAS: Deep and Flexible Graph Neural Architecture Search | Code | 0
AdversarialNAS: Adversarial Neural Architecture Search for GANs | Code | 0
Neural Architecture Search: A Survey | Code | 0
Revisiting Multimodal Fusion for 3D Anomaly Detection from an Architectural Perspective | Code | 0
Structural Pruning of Pre-trained Language Models via Neural Architecture Search | Code | 0
B2EA: An Evolutionary Algorithm Assisted by Two Bayesian Optimization Modules for Neural Architecture Search | Code | 0
Speedy Performance Estimation for Neural Architecture Search | Code | 0
Structured Pruning and Quantization for Learned Image Compression | Code | 0
DetNAS: Backbone Search for Object Detection | Code | 0
Revisiting Training-free NAS Metrics: An Efficient Training-based Method | Code | 0
Neural Architecture Search for Deep Image Prior | Code | 0
ATOM: Attention Mixer for Efficient Dataset Distillation | Code | 0
A Variational-Sequential Graph Autoencoder for Neural Architecture Performance Prediction | Code | 0
Page 37 of 39

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | SPOS (ProxylessNAS (GPU) latency) | Accuracy | 75.3 | – | Unverified
2 | SPOS (FBNet-C latency) | Accuracy | 75.1 | – | Unverified
3 | SPOS (block search + channel search) | Accuracy | 74.7 | – | Unverified
4 | MUXNet-xs | Top-1 Error Rate | 33.3 | – | Unverified
5 | FBNetV2-F1 | Top-1 Error Rate | 31.7 | – | Unverified
6 | LayerNAS-60M | Top-1 Error Rate | 31 | – | Unverified
7 | NASGEP | Top-1 Error Rate | 29.51 | – | Unverified
8 | MUXNet-s | Top-1 Error Rate | 28.4 | – | Unverified
9 | NN-MASS-A | Top-1 Error Rate | 27.1 | – | Unverified
10 | FBNetV2-F3 | Top-1 Error Rate | 26.8 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CR-LSO | Accuracy (Test) | 46.98 | – | Unverified
2 | Shapley-NAS | Accuracy (Test) | 46.85 | – | Unverified
3 | β-RDARTS-L2 | Accuracy (Test) | 46.71 | – | Unverified
4 | β-SDARTS-RS | Accuracy (Test) | 46.71 | – | Unverified
5 | ASE-NAS+ | Accuracy (Val) | 46.66 | – | Unverified
6 | NAR | Accuracy (Test) | 46.66 | – | Unverified
7 | BaLeNAS-TF | Accuracy (Test) | 46.54 | – | Unverified
8 | AG-Net | Accuracy (Test) | 46.42 | – | Unverified
9 | Local search | Accuracy (Test) | 46.38 | – | Unverified
10 | NASBOT | Accuracy (Test) | 46.37 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Balanced Mixture | Accuracy (%) | 91.55 | – | Unverified
2 | GDAS | Top-1 Error Rate | 3.4 | – | Unverified
3 | Bonsai-Net | Top-1 Error Rate | 3.35 | – | Unverified
4 | Net2 (2) | Top-1 Error Rate | 3.3 | – | Unverified
5 | μDARTS | Top-1 Error Rate | 3.28 | – | Unverified
6 | NN-MASS-CIFAR-C | Top-1 Error Rate | 3.18 | – | Unverified
7 | NN-MASS-CIFAR-A | Top-1 Error Rate | 3 | – | Unverified
8 | DARTS (first order) | Top-1 Error Rate | 3 | – | Unverified
9 | NASGEP | Top-1 Error Rate | 2.82 | – | Unverified
10 | AlphaX-1 (cutout NASNet) | Top-1 Error Rate | 2.82 | – | Unverified