SOTAVerified

Neural Architecture Search

Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANNs), a widely used model class in machine learning. NAS automates the trial-and-error process by which a human designer tweaks a network and learns what works: it defines a search space of candidate architectures, a search strategy for exploring that space, and a performance estimation strategy for scoring candidates, allowing it to discover architectures that manual design might miss.
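The three-part loop described above can be illustrated with a minimal sketch. This is a toy random-search NAS in plain Python: the search space, the `proxy_score` function, and all names are illustrative assumptions, not taken from any specific NAS paper; in a real system, `proxy_score` would train the candidate and return its validation accuracy.

```python
import random

# Toy search space: each key is an architectural choice,
# each list the allowed values (an illustrative assumption).
SEARCH_SPACE = {
    "depth": [2, 4, 6, 8],
    "width": [16, 32, 64, 128],
    "kernel": [3, 5, 7],
}

def sample_architecture(rng):
    """Search strategy step: sample one candidate architecture."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Performance estimation stand-in. Real NAS would train the
    network here and measure validation accuracy; this toy score
    merely prefers moderate depth and larger width."""
    return (arch["width"] / 128
            - abs(arch["depth"] - 6) * 0.1
            - (arch["kernel"] - 3) * 0.02)

def random_search(n_trials=50, seed=0):
    """Evaluate n_trials random candidates and keep the best one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    best, score = random_search()
    print(best, score)
```

More sophisticated search strategies (reinforcement learning, evolution, or differentiable relaxations such as DARTS) replace the random sampling loop, but the overall structure stays the same.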

Image credit: Neural Architecture Search with Reinforcement Learning

Papers

Showing 1–10 of 1,915 papers

| Title | Status | Hype |
|---|---|---|
| DASViT: Differentiable Architecture Search for Vision Transformer | | 0 |
| AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing | Code | 2 |
| From Tiny Machine Learning to Tiny Deep Learning: A Survey | Code | 2 |
| DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification | | 0 |
| One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification | | 0 |
| Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks An Architecture Optimization Approach | | 0 |
| MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering | | 0 |
| Directed Acyclic Graph Convolutional Networks | | 0 |
| Efficient Traffic Classification using HW-NAS: Advanced Analysis and Optimization for Cybersecurity on Resource-Constrained Devices | | 0 |
| Energy-Efficient Deep Learning for Traffic Classification on Microcontrollers | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Balanced Mixture | Accuracy (%) | 79.61 | | Unverified |
| 2 | μDARTS | Percentage Error | 19.39 | | Unverified |
| 3 | NASGEP | Percentage Error | 18.83 | | Unverified |
| 4 | DARTS-PRIME | Percentage Error | 17.44 | | Unverified |
| 5 | DU-DARTS | Percentage Error | 16.74 | | Unverified |
| 6 | β-DARTS | Percentage Error | 16.52 | | Unverified |
| 7 | ZenNet-2.0M | Percentage Error | 15.6 | | Unverified |
| 8 | NAT-M1 | Percentage Error | 14 | | Unverified |
| 9 | MUXNet-m | Percentage Error | 13.9 | | Unverified |
| 10 | NAT-M2 | Percentage Error | 12.5 | | Unverified |