
Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
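In practice, fine-tuning often amounts to loading a pre-trained backbone, freezing most of its weights, and training a new task-specific head. The snippet below is a minimal sketch of this pattern in PyTorch, using torchvision's ImageNet-pretrained ResNet-18; `NUM_CLASSES` and the optimizer settings are illustrative placeholders, not values from any paper listed here.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pre-trained on ImageNet.
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# Freeze the backbone so its learned features are reused as-is.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new task.
# NUM_CLASSES is a placeholder for the target dataset's label count.
NUM_CLASSES = 10
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Only the new head's parameters are optimized; the learning rate
# here is an illustrative default, not a recommendation.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```

A common variant is to later unfreeze some or all backbone layers and continue training at a lower learning rate, which trades more compute for a closer fit to the target domain.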

Papers

Showing 651-675 of 10,307 papers

| Title | Status | Hype |
| --- | --- | --- |
| A Realistic Evaluation of Semi-Supervised Learning for Fine-Grained Classification | Code | 1 |
| Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation | Code | 1 |
| AReLU: Attention-based Rectified Linear Unit | Code | 1 |
| Automatic identification of segmentation errors for radiotherapy using geometric learning | Code | 1 |
| Deep Boosting Learning: A Brand-new Cooperative Approach for Image-Text Matching | Code | 1 |
| Global Self-Attention as a Replacement for Graph Convolution | Code | 1 |
| EEG-Reptile: An Automatized Reptile-Based Meta-Learning Library for BCIs | Code | 1 |
| EENLP: Cross-lingual Eastern European NLP Index | Code | 1 |
| Auxiliary Signal-Guided Knowledge Encoder-Decoder for Medical Report Generation | Code | 1 |
| AutoTune: Automatically Tuning Convolutional Neural Networks for Improved Transfer Learning | Code | 1 |
| A Data-Based Perspective on Transfer Learning | Code | 1 |
| Efficient Adaptation of Large Vision Transformer via Adapter Re-Composing | Code | 1 |
| Data Mining in Clinical Trial Text: Transformers for Classification and Question Answering Tasks | Code | 1 |
| A Visual Analytics Framework for Explaining and Diagnosing Transfer Learning Processes | Code | 1 |
| Analysis of skin lesion images with deep learning | Code | 1 |
| AVocaDo: Strategy for Adapting Vocabulary to Downstream Domain | Code | 1 |
| A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1 |
| BadMerging: Backdoor Attacks Against Model Merging | Code | 1 |
| AraT5: Text-to-Text Transformers for Arabic Language Generation | Code | 1 |
| DDAM-PS: Diligent Domain Adaptive Mixer for Person Search | Code | 1 |
| Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation | Code | 1 |
| BARThez: a Skilled Pretrained French Sequence-to-Sequence Model | Code | 1 |
| AquilaMoE: Efficient Training for MoE Models with Scale-Up and Scale-Out Strategies | Code | 1 |
| EffiSegNet: Gastrointestinal Polyp Segmentation through a Pre-Trained EfficientNet-based Network with a Simplified Decoder | Code | 1 |
| ArMATH: a Dataset for Solving Arabic Math Word Problems | Code | 1 |
Page 27 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |