SOTAVerified

Data Augmentation

Data augmentation comprises techniques that expand a dataset by creating modified copies of its existing examples. It not only grows the dataset but also increases its diversity. When training machine learning models, data augmentation acts as a regularizer and helps prevent overfitting.
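As a minimal sketch of the idea (using only NumPy, no augmentation library assumed — the function names here are illustrative, not a real API), each transformation below returns a modified copy of an input image, so a single example can yield many distinct training examples:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_flip(img):
    """Horizontally flip the image with probability 0.5."""
    return img[:, ::-1].copy() if rng.random() < 0.5 else img

def random_crop(img, size):
    """Cut out a random size x size patch."""
    h, w = img.shape[:2]
    top = int(rng.integers(0, h - size + 1))
    left = int(rng.integers(0, w - size + 1))
    return img[top:top + size, left:left + size]

def random_rotation(img):
    """Rotate by a random multiple of 90 degrees (shape-preserving for square patches)."""
    return np.rot90(img, k=int(rng.integers(0, 4)))

def augment(img, crop_size=28):
    """Compose the transformations into one randomized augmentation."""
    return random_rotation(random_flip(random_crop(img, crop_size)))

image = rng.random((32, 32, 3))  # stand-in for a real 32x32 RGB image
augmented = augment(image)
print(augmented.shape)           # (28, 28, 3)
```

Calling `augment` repeatedly on the same image produces different crops, flips, and rotations each time, which is what gives the regularizing effect described above.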

Data augmentation techniques have proven useful in domains such as computer vision and NLP. In computer vision, common transformations include cropping, flipping, and rotation. In NLP, techniques include word swapping, word deletion, and random insertion, among others.
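The NLP operations above can be sketched in plain Python in the style of EDA (Easy Data Augmentation). Note one simplification: EDA's random insertion inserts a synonym, whereas this dependency-free sketch inserts a copy of an existing word; the function names are illustrative, not a library API:

```python
import random

random.seed(0)

def random_swap(words, n=1):
    """Swap two random positions, n times."""
    words = words[:]
    for _ in range(n):
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p=0.1):
    """Drop each word with probability p, keeping at least one word."""
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(words)]

def random_insertion(words, n=1):
    """Insert a copy of a random existing word at a random position
    (real EDA would insert a synonym instead)."""
    words = words[:]
    for _ in range(n):
        words.insert(random.randrange(len(words) + 1), random.choice(words))
    return words

sentence = "data augmentation improves model generalization".split()
print(random_swap(sentence))
```

Each function returns a new list, so the original sentence is untouched and can be augmented many times over.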


(Image credit: Albumentations)

Papers

Showing 2026–2050 of 8378 papers

Title | Status | Hype
Causal Information Prioritization for Efficient Reinforcement Learning | - | 0
Rao-Blackwell Gradient Estimators for Equivariant Denoising Diffusion | - | 0
Towards Understanding Why Data Augmentation Improves Generalization | - | 0
Generalizability through Explainability: Countering Overfitting with Counterfactual Examples | - | 0
DICE: Device-level Integrated Circuits Encoder with Graph Contrastive Pretraining | Code | 0
Low-Resolution Neural Networks | - | 0
Foliar Uptake of Biocides: Statistical Assessment of Compartmental and Diffusion-Based Models | - | 0
Advancing machine fault diagnosis: A detailed examination of convolutional neural networks | - | 0
Robustly Learning Monotone Generalized Linear Models via Data Augmentation | - | 0
Data Augmentation to Improve Large Language Models in Food Hazard and Product Detection | Code | 0
Federated Self-supervised Domain Generalization for Label-efficient Polyp Segmentation | - | 0
Towards Understanding of Frequency Dependence on Sound Event Detection | - | 0
An Advanced NLP Framework for Automated Medical Diagnosis with DeBERTa and Dynamic Contextual Positional Gating | - | 0
Data Augmentation and Regularization for Learning Group Equivariance | Code | 0
A Framework for Supervised and Unsupervised Segmentation and Classification of Materials Microstructure Images | - | 0
Universality of High-Dimensional Logistic Regression and a Novel CGMT under Dependence with Applications to Data Augmentation | - | 0
Boost-and-Skip: A Simple Guidance-Free Diffusion for Minority Generation | Code | 0
AI-Driven HSI: Multimodality, Fusion, Challenges, and the Deep Learning Revolution | - | 0
WatchGuardian: Enabling User-Defined Personalized Just-in-Time Intervention on Smartwatch | - | 0
DCENWCNet: A Deep CNN Ensemble Network for White Blood Cell Classification with LIME-Based Explainability | - | 0
Diagonal Symmetrization of Neural Network Solvers for the Many-Electron Schrödinger Equation | - | 0
NextBestPath: Efficient 3D Mapping of Unseen Environments | - | 0
Importance Sampling via Score-based Generative Models | - | 0
Graph Contrastive Learning for Connectome Classification | Code | 0
YOLOv4: A Breakthrough in Real-Time Object Detection | - | 0
Page 82 of 336

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | DeiT-B (+MixPro) | Accuracy (%) | 82.9 | - | Unverified
2 | ResNet-200 (DeepAA) | Accuracy (%) | 81.32 | - | Unverified
3 | DeiT-S (+MixPro) | Accuracy (%) | 81.3 | - | Unverified
4 | ResNet-200 (Fast AA) | Accuracy (%) | 80.6 | - | Unverified
5 | ResNet-200 (UA) | Accuracy (%) | 80.4 | - | Unverified
6 | ResNet-200 (AA) | Accuracy (%) | 80 | - | Unverified
7 | ResNet-50 (DeepAA) | Accuracy (%) | 78.3 | - | Unverified
8 | ResNet-50 (TA wide) | Accuracy (%) | 78.07 | - | Unverified
9 | ResNet-50 (LoRot-E) | Accuracy (%) | 77.72 | - | Unverified
10 | ResNet-50 (LoRot-I) | Accuracy (%) | 77.71 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | WideResNet-40-2 (Faster AA) | Percentage error | 3.7 | - | Unverified
2 | Shake-Shake (26 2×32d) (Faster AA) | Percentage error | 2.7 | - | Unverified
3 | WideResNet-28-10 (Faster AA) | Percentage error | 2.6 | - | Unverified
4 | Shake-Shake (26 2×112d) (Faster AA) | Percentage error | 2 | - | Unverified
5 | Shake-Shake (26 2×96d) (Faster AA) | Percentage error | 2 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | DiffAug | Classification Accuracy | 92.7 | - | Unverified
2 | PaCMAP | Classification Accuracy | 85.3 | - | Unverified
3 | hNNE | Classification Accuracy | 77.4 | - | Unverified
4 | TopoAE | Classification Accuracy | 74.6 | - | Unverified