SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new but related problem. This is useful when too little data is available to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
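The "frozen backbone, trainable head" recipe described above can be sketched in a few lines. This is a toy NumPy illustration, not any particular paper's method: a fixed random projection stands in for a pre-trained backbone (in practice this would be, say, an ImageNet-trained CNN), and only a small logistic-regression head is trained on the new task. All names and dimensions here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: a FROZEN feature extractor.
# (Toy assumption: a fixed random projection plays the role of weights
# learned on a large source task; they are never updated below.)
W_backbone = rng.normal(size=(20, 16)) * 0.2

def extract_features(x):
    """Frozen backbone: map raw 20-dim inputs to 16-dim features."""
    return np.tanh(x @ W_backbone)

# Toy "target task": binary classification with limited data.
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the small task-specific head is trained (logistic regression
# on the frozen features, via plain gradient descent).
feats = extract_features(X)
w_head = np.zeros(16)
b_head = 0.0
lr = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head)))  # sigmoid
    grad = p - y                                          # dLoss/dlogits
    w_head -= lr * (feats.T @ grad) / len(X)
    b_head -= lr * grad.mean()

acc = ((feats @ w_head + b_head > 0) == (y > 0.5)).mean()
print(f"accuracy with frozen backbone + trained head: {acc:.2f}")
```

Because the backbone is frozen, only 17 parameters are trained, which is why this recipe works even when target-task data is scarce; fine-tuning (also unfreezing some backbone layers) is the natural next step when more data is available.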

Papers

Showing 451–475 of 10,307 papers

| Title | Status | Hype |
|---|---|---|
| Confidence-Aware Multi-Teacher Knowledge Distillation | Code | 1 |
| A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1 |
| BadMerging: Backdoor Attacks Against Model Merging | Code | 1 |
| Bag of Tricks for Image Classification with Convolutional Neural Networks | Code | 1 |
| 2021 BEETL Competition: Advancing Transfer Learning for Subject Independence & Heterogenous EEG Data Sets | Code | 1 |
| Uncovering the Connections Between Adversarial Transferability and Knowledge Transferability | Code | 1 |
| Enhanced Gaussian Process Dynamical Models with Knowledge Transfer for Long-term Battery Degradation Forecasting | Code | 1 |
| Does Pretraining for Summarization Require Knowledge Transfer? | Code | 1 |
| An Improved Person Re-identification Method by light-weight convolutional neural network | Code | 1 |
| Domain Adaptation with Invariant Representation Learning: What Transformations to Learn? | Code | 1 |
| Anomaly Detection in Time Series with Triadic Motif Fields and Application in Atrial Fibrillation ECG Classification | Code | 1 |
| Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search | Code | 1 |
| Confidence-based Visual Dispersal for Few-shot Unsupervised Domain Adaptation | Code | 1 |
| Do Vision Transformers See Like Convolutional Neural Networks? | Code | 1 |
| Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1 |
| Drug and Disease Interpretation Learning with Biomedical Entity Representation Transformer | Code | 1 |
| Continual Learning with Knowledge Transfer for Sentiment Classification | Code | 1 |
| Affordance Transfer Learning for Human-Object Interaction Detection | Code | 1 |
| Active Transfer Learning for Efficient Video-Specific Human Pose Estimation | Code | 1 |
| Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation | Code | 1 |
| COVID-19 detection from scarce chest x-ray image data using few-shot deep learning approach | Code | 1 |
| An Empirical Study of Pre-trained Transformers for Arabic Information Extraction | Code | 1 |
| DDAM-PS: Diligent Domain Adaptive Mixer for Person Search | Code | 1 |
| EasyTransfer -- A Simple and Scalable Deep Transfer Learning Platform for NLP Applications | Code | 1 |
| An Evaluation of Self-Supervised Pre-Training for Skin-Lesion Analysis | Code | 1 |
Page 19 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |