SOTAVerified

Data Augmentation

Data augmentation covers techniques that expand a dataset by creating modified copies of its existing examples. It not only grows the dataset but also increases its diversity. When training machine learning models, data augmentation acts as a regularizer and helps prevent overfitting.
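As a minimal sketch of the idea, the snippet below turns one image (here just a NumPy array) into several modified variants via flips, a rotation, and a random crop. The function name `augment_image` and the 3/4 crop size are illustrative choices, not a standard API.

```python
import numpy as np

def augment_image(img, rng):
    """Return a list of simple augmented variants of a (H, W) image array."""
    variants = [
        np.fliplr(img),   # horizontal flip
        np.flipud(img),   # vertical flip
        np.rot90(img),    # 90-degree rotation
    ]
    # Random crop back to a smaller fixed size (illustrative 3/4 crop).
    h, w = img.shape[:2]
    ch, cw = (3 * h) // 4, (3 * w) // 4
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    variants.append(img[top:top + ch, left:left + cw])
    return variants

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
augmented = augment_image(image, rng)
print(len(augmented))  # 4 variants per original image
```

Each original example yields four extra training examples here; libraries such as Albumentations apply the same kinds of transforms with many more options and random parameters.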

Data augmentation techniques have proven useful in domains such as computer vision and NLP. In computer vision, common transformations include cropping, flipping, and rotation. In NLP, techniques include word swapping, deletion, and random insertion, among others.
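The NLP operations listed above can be sketched with the standard library alone; the function names and default probabilities below are illustrative (these operations mirror common "easy data augmentation" style ops, not a specific library's API):

```python
import random

def random_deletion(tokens, p=0.2, rng=random):
    """Drop each token with probability p; never return an empty sentence."""
    kept = [t for t in tokens if rng.random() > p]
    return kept or [rng.choice(tokens)]

def random_swap(tokens, n_swaps=1, rng=random):
    """Swap n_swaps random pairs of token positions."""
    tokens = list(tokens)
    if len(tokens) < 2:
        return tokens
    for _ in range(n_swaps):
        i, j = rng.sample(range(len(tokens)), 2)
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens

def random_insertion(tokens, vocab, n_inserts=1, rng=random):
    """Insert n_inserts random vocabulary words at random positions."""
    tokens = list(tokens)
    for _ in range(n_inserts):
        tokens.insert(rng.randrange(len(tokens) + 1), rng.choice(vocab))
    return tokens

rng = random.Random(42)
sentence = "data augmentation helps avoid overfitting".split()
print(random_swap(sentence, rng=rng))
```

In practice, insertions usually draw from synonyms of words already in the sentence rather than an arbitrary vocabulary, so the label-preserving meaning is more likely to survive.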

(Image credit: Albumentations)

Papers

Showing 3801–3825 of 8378 papers

- Handling Rare Word Problem using Synthetic Training Data for Sinhala and Tamil Neural Machine Translation
- Cross-speaker style transfer for text-to-speech using data augmentation
- HandSeg: An Automatically Labeled Dataset for Hand Segmentation from Depth Images
- Handwritten Amharic Character Recognition Using a Convolutional Neural Network
- DiffStitch: Boosting Offline Reinforcement Learning with Diffusion-based Trajectory Stitching
- Handwritten image augmentation
- Boosting Adversarial Transferability with Spatial Adversarial Alignment
- DiffPop: Plausibility-Guided Object Placement Diffusion for Image Composition
- An improved EfficientNetV2 for garbage classification
- Diff-Lung: Diffusion-Based Texture Synthesis for Enhanced Pathological Tissue Segmentation in Lung CT Scans
- HARD: Hard Augmentations for Robust Distillation
- Boosted EfficientNet: Detection of Lymph Node Metastases in Breast Cancer Using Convolutional Neural Network
- Dataset of Random Relaxations for Crystal Structure Search of Li-Si System
- Hard-Synth: Synthesizing Diverse Hard Samples for ASR using Zero-Shot TTS and LLM
- diffIRM: A Diffusion-Augmented Invariant Risk Minimization Framework for Spatiotemporal Prediction over Graphs
- Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing
- Differential Expression Analysis of Dynamical Sequencing Count Data with a Gamma Markov Chain
- An Improved Deep Learning Approach For Product Recognition on Racks in Retail Stores
- CSSL: Contrastive Self-Supervised Learning for Dependency Parsing on Relatively Free Word Ordered and Morphologically Rich Low Resource Languages
- CST5: Data augmentation for Code-Switched Semantic Parsing
- Harnessing Hard Mixed Samples with Decoupled Regularizer
- Harnessing The Power of Attention For Patch-Based Biomedical Image Classification
- Advancing Stuttering Detection via Data Augmentation, Class-Balanced Loss and Multi-Contextual Deep Learning
- HARPT: A Corpus for Analyzing Consumers' Trust and Privacy Concerns in Mobile Health Apps
- Impossible Triangle: What's Next for Pre-trained Language Models?

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | DeiT-B (+MixPro) | Accuracy (%) | 82.9 | | Unverified |
| 2 | ResNet-200 (DeepAA) | Accuracy (%) | 81.32 | | Unverified |
| 3 | DeiT-S (+MixPro) | Accuracy (%) | 81.3 | | Unverified |
| 4 | ResNet-200 (Fast AA) | Accuracy (%) | 80.6 | | Unverified |
| 5 | ResNet-200 (UA) | Accuracy (%) | 80.4 | | Unverified |
| 6 | ResNet-200 (AA) | Accuracy (%) | 80 | | Unverified |
| 7 | ResNet-50 (DeepAA) | Accuracy (%) | 78.3 | | Unverified |
| 8 | ResNet-50 (TA wide) | Accuracy (%) | 78.07 | | Unverified |
| 9 | ResNet-50 (LoRot-E) | Accuracy (%) | 77.72 | | Unverified |
| 10 | ResNet-50 (LoRot-I) | Accuracy (%) | 77.71 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | WideResNet-40-2 (Faster AA) | Percentage error | 3.7 | | Unverified |
| 2 | Shake-Shake (26 2×32d) (Faster AA) | Percentage error | 2.7 | | Unverified |
| 3 | WideResNet-28-10 (Faster AA) | Percentage error | 2.6 | | Unverified |
| 4 | Shake-Shake (26 2×112d) (Faster AA) | Percentage error | 2 | | Unverified |
| 5 | Shake-Shake (26 2×96d) (Faster AA) | Percentage error | 2 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | DiffAug | Classification Accuracy | 92.7 | | Unverified |
| 2 | PaCMAP | Classification Accuracy | 85.3 | | Unverified |
| 3 | hNNE | Classification Accuracy | 77.4 | | Unverified |
| 4 | TopoAE | Classification Accuracy | 74.6 | | Unverified |