SOTAVerified

Data Augmentation

Data augmentation is a family of techniques that expand a dataset by applying modifications to its existing examples. It not only grows the dataset but also increases its diversity. When training machine learning models, data augmentation acts as a regularizer and helps prevent overfitting.

Data augmentation has proven useful in domains such as computer vision and NLP. In computer vision, common transformations include cropping, flipping, and rotation. In NLP, techniques include word swapping, word deletion, and random insertion, among others.
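The transformations above can be sketched in a few lines of plain Python. The helpers below are illustrative, assumed implementations (not any specific library's API): EDA-style random swap and random deletion for text, and a horizontal flip for an image stored as a list of pixel rows.

```python
import random

def random_swap(words, n_swaps, rng):
    """Swap n_swaps random pairs of word positions."""
    words = list(words)
    for _ in range(n_swaps):
        i, j = rng.randrange(len(words)), rng.randrange(len(words))
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p, rng):
    """Drop each word with probability p, keeping at least one word."""
    kept = [w for w in words if rng.random() > p]
    return kept if kept else [rng.choice(words)]

def hflip(image):
    """Horizontal flip: reverse each pixel row."""
    return [row[::-1] for row in image]

rng = random.Random(0)
sentence = "data augmentation expands the training set".split()
print(random_swap(sentence, 1, rng))
print(random_deletion(sentence, 0.3, rng))
print(hflip([[1, 2, 3], [4, 5, 6]]))  # → [[3, 2, 1], [6, 5, 4]]
```

In practice these operations are applied on the fly during training, so each epoch sees a slightly different version of every example.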

(Image credit: Albumentations)

Papers

Showing 7576–7600 of 8378 papers

| Title | Status | Hype |
|---|---|---|
| Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation | | 0 |
| Guiding Generative Language Models for Data Augmentation in Few-Shot Text Classification | | 0 |
| HABD: a houma alliance book ancient handwritten character recognition database | | 0 |
| Hallucinations in Neural Machine Translation | | 0 |
| HAMLET: Hierarchical Harmonic Filters for Learning Tracts from Diffusion MRI | | 0 |
| HandDiffuse: Generative Controllers for Two-Hand Interactions via Diffusion Models | | 0 |
| Hand gesture recognition using 802.11ad mmWave sensor in the mobile device | | 0 |
| Handling Climate Change Using Counterfactuals: Using Counterfactuals in Data Augmentation to Predict Crop Growth in an Uncertain Climate Future | | 0 |
| Handling missing data in model-based clustering | | 0 |
| Handling Rare Word Problem using Synthetic Training Data for Sinhala and Tamil Neural Machine Translation | | 0 |
| HandSeg: An Automatically Labeled Dataset for Hand Segmentation from Depth Images | | 0 |
| Handwritten Amharic Character Recognition Using a Convolutional Neural Network | | 0 |
| Handwritten Digit Recognition: An Ensemble-Based Approach for Superior Performance | | 0 |
| Handwritten image augmentation | | 0 |
| Handwritten text generation and strikethrough characters augmentation | | 0 |
| HardCore Generation: Generating Hard UNSAT Problems for Data Augmentation | | 0 |
| HARD: Hard Augmentations for Robust Distillation | | 0 |
| Hard-Synth: Synthesizing Diverse Hard Samples for ASR using Zero-Shot TTS and LLM | | 0 |
| Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing | | 0 |
| Harnessing Hard Mixed Samples with Decoupled Regularizer | | 0 |
| Harnessing The Power of Attention For Patch-Based Biomedical Image Classification | | 0 |
| HARP: Autoregressive Latent Video Prediction with High-Fidelity Image Generator | | 0 |
| HARPT: A Corpus for Analyzing Consumers' Trust and Privacy Concerns in Mobile Health Apps | | 0 |
| HaT5: Hate Language Identification using Text-to-Text Transfer Transformer | | 0 |
Page 304 of 336

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DeiT-B (+MixPro) | Accuracy (%) | 82.9 | | Unverified |
| 2 | ResNet-200 (DeepAA) | Accuracy (%) | 81.32 | | Unverified |
| 3 | DeiT-S (+MixPro) | Accuracy (%) | 81.3 | | Unverified |
| 4 | ResNet-200 (Fast AA) | Accuracy (%) | 80.6 | | Unverified |
| 5 | ResNet-200 (UA) | Accuracy (%) | 80.4 | | Unverified |
| 6 | ResNet-200 (AA) | Accuracy (%) | 80 | | Unverified |
| 7 | ResNet-50 (DeepAA) | Accuracy (%) | 78.3 | | Unverified |
| 8 | ResNet-50 (TA wide) | Accuracy (%) | 78.07 | | Unverified |
| 9 | ResNet-50 (LoRot-E) | Accuracy (%) | 77.72 | | Unverified |
| 10 | ResNet-50 (LoRot-I) | Accuracy (%) | 77.71 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | WideResNet-40-2 (Faster AA) | Percentage error | 3.7 | | Unverified |
| 2 | Shake-Shake (26 2×32d) (Faster AA) | Percentage error | 2.7 | | Unverified |
| 3 | WideResNet-28-10 (Faster AA) | Percentage error | 2.6 | | Unverified |
| 4 | Shake-Shake (26 2×96d) (Faster AA) | Percentage error | 2 | | Unverified |
| 5 | Shake-Shake (26 2×112d) (Faster AA) | Percentage error | 2 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DiffAug | Classification Accuracy | 92.7 | | Unverified |
| 2 | PaCMAP | Classification Accuracy | 85.3 | | Unverified |
| 3 | hNNE | Classification Accuracy | 77.4 | | Unverified |
| 4 | TopoAE | Classification Accuracy | 74.6 | | Unverified |
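Some of the models above (e.g. DeiT with MixPro) build on the mixup family of augmentations, which blends pairs of training examples rather than transforming single ones. The sketch below is a minimal, assumed illustration of the core mixup idea, not MixPro itself: take a convex combination of two feature vectors and their one-hot labels with a coefficient `lam` (in practice typically drawn from a Beta distribution each step).

```python
def mixup(x1, y1, x2, y2, lam):
    """Convex combination of two (features, one-hot label) pairs."""
    x = [lam * a + (1 - lam) * b for a, b in zip(x1, x2)]
    y = [lam * a + (1 - lam) * b for a, b in zip(y1, y2)]
    return x, y

# Blend two toy two-class samples with lam = 0.75.
x, y = mixup([1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0], 0.75)
print(x)  # → [0.75, 0.25]
print(y)  # → [0.75, 0.25]
```

Because the label is blended along with the input, the model is trained on soft targets, which is what gives mixup-style methods their regularizing effect.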