SOTAVerified

Data Augmentation

Data augmentation is a family of techniques that expands a dataset by creating modified copies of its existing examples. It not only grows the dataset but also increases its diversity, and during training it acts as a regularizer that helps prevent overfitting.

Data augmentation has proved useful in domains such as computer vision and NLP. In computer vision, common transformations include cropping, flipping, and rotation. In NLP, techniques include token swapping, deletion, and random insertion, among others.
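The operations above can be sketched in a few lines. The snippet below is a minimal illustration, not taken from any particular library: the function names and defaults are my own. It uses NumPy for the image transforms and the standard library for EDA-style text augmentation (the insertion function reuses existing tokens as a stand-in for synonym insertion, which would need a thesaurus).

```python
import random
import numpy as np

# --- Image augmentation (img: H x W or H x W x C array) ---

def horizontal_flip(img):
    # Mirror the image along its width axis.
    return img[:, ::-1]

def random_crop(img, size, rng):
    # Cut out a random (ch x cw) window; rng is a numpy Generator.
    h, w = img.shape[:2]
    ch, cw = size
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    return img[top:top + ch, left:left + cw]

# --- Text augmentation (tokens: non-empty list of strings) ---

def random_swap(tokens, n_swaps=1, rng=random):
    # Exchange two random positions, n_swaps times.
    tokens = tokens[:]
    for _ in range(n_swaps):
        if len(tokens) < 2:
            break
        i, j = rng.sample(range(len(tokens)), 2)
        tokens[i], tokens[j] = tokens[j], tokens[i]
    return tokens

def random_deletion(tokens, p=0.1, rng=random):
    # Drop each token with probability p; always keep at least one.
    kept = [t for t in tokens if rng.random() > p]
    return kept if kept else [rng.choice(tokens)]

def random_insertion(tokens, n_inserts=1, rng=random):
    # Insert copies of existing tokens at random positions
    # (a placeholder for synonym insertion).
    tokens = tokens[:]
    for _ in range(n_inserts):
        tokens.insert(rng.randrange(len(tokens) + 1), rng.choice(tokens))
    return tokens
```

In practice each transform is applied with some probability inside the training loop, so the model sees a different variant of each example every epoch.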

(Image credit: Albumentations)

Papers

Showing 6376–6400 of 8378 papers

Title | Status | Hype
Medical Scientific Table-to-Text Generation with Human-in-the-Loop under the Data Sparsity Constraint | | 0
Medication Mention Detection in Tweets Using ELECTRA Transformers and Decision Trees | | 0
Medication Regimen Extraction From Medical Conversations | | 0
MEDSAGE: Enhancing Robustness of Medical Dialogue Summarization to ASR Errors with LLM-generated Synthetic Dialogues | | 0
Melon Fruit Detection and Quality Assessment Using Generative AI-Based Image Data Augmentation | | 0
Membership-Doctor: Comprehensive Assessment of Membership Inference Against Machine Learning Models | | 0
Membership Privacy Evaluation in Deep Spiking Neural Networks | | 0
Memorize or Generalize? Evaluating LLM Code Generation with Evolved Questions | | 0
Memory-based Jitter: Improving Visual Recognition on Long-tailed Data with Diversity In Memory | | 0
Memory Classifiers: Two-stage Classification for Robustness in Machine Learning | | 0
"Mental Rotation" by Optimizing Transforming Distance | | 0
Social Media as an Instant Source of Feedback on Water Quality | | 0
Meta Approach to Data Augmentation Optimization | | 0
MetaAugment: Sample-Aware Data Augmentation Policy Learning | | 0
Meta Knowledge Distillation | | 0
Meta-Learning in Audio and Speech Processing: An End to End Comprehensive Review | | 0
Meta-Learning to Improve Pre-Training | | 0
MetaMind Neural Machine Translation System for WMT 2016 | | 0
MetaMix: Towards Corruption-Robust Continual Learning With Temporally Self-Adaptive Data Transformation | | 0
MetaMixUp: Learning Adaptive Interpolation Policy of MixUp with Meta-Learning | | 0
MetaTool: Facilitating Large Language Models to Master Tools with Meta-task Augmentation | | 0
Meta-Transfer Derm-Diagnosis: Exploring Few-Shot Learning and Transfer Learning for Skin Disease Classification in Long-Tail Distribution | | 0
Meta-tuning Loss Functions and Data Augmentation for Few-shot Object Detection | | 0
SMILE: Speech Meta In-Context Learning for Low-Resource Language Automatic Speech Recognition | | 0
Method and Dataset Entity Mining in Scientific Literature: A CNN + Bi-LSTM Model with Self-attention | | 0
Page 256 of 336

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | DeiT-B (+MixPro) | Accuracy (%) | 82.9 | | Unverified
2 | ResNet-200 (DeepAA) | Accuracy (%) | 81.32 | | Unverified
3 | DeiT-S (+MixPro) | Accuracy (%) | 81.3 | | Unverified
4 | ResNet-200 (Fast AA) | Accuracy (%) | 80.6 | | Unverified
5 | ResNet-200 (UA) | Accuracy (%) | 80.4 | | Unverified
6 | ResNet-200 (AA) | Accuracy (%) | 80 | | Unverified
7 | ResNet-50 (DeepAA) | Accuracy (%) | 78.3 | | Unverified
8 | ResNet-50 (TA wide) | Accuracy (%) | 78.07 | | Unverified
9 | ResNet-50 (LoRot-E) | Accuracy (%) | 77.72 | | Unverified
10 | ResNet-50 (LoRot-I) | Accuracy (%) | 77.71 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | WideResNet-40-2 (Faster AA) | Percentage error | 3.7 | | Unverified
2 | Shake-Shake (26 2×32d) (Faster AA) | Percentage error | 2.7 | | Unverified
3 | WideResNet-28-10 (Faster AA) | Percentage error | 2.6 | | Unverified
4 | Shake-Shake (26 2×112d) (Faster AA) | Percentage error | 2 | | Unverified
5 | Shake-Shake (26 2×96d) (Faster AA) | Percentage error | 2 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | DiffAug | Classification Accuracy | 92.7 | | Unverified
2 | PaCMAP | Classification Accuracy | 85.3 | | Unverified
3 | hNNE | Classification Accuracy | 77.4 | | Unverified
4 | TopoAE | Classification Accuracy | 74.6 | | Unverified