
Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model when solving a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original one that the pre-trained model can be adapted with only minor modifications.
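
As a concrete illustration, the sketch below shows one common transfer-learning recipe in PyTorch/torchvision: load an ImageNet-pretrained ResNet-18, freeze its backbone, and fine-tune only a newly attached classification head. The 10-class target task, batch size, and dummy tensors are illustrative assumptions, not taken from any paper listed here.

```python
# Minimal transfer-learning sketch (PyTorch / torchvision).
# Assumptions: ImageNet-pretrained ResNet-18 backbone, hypothetical 10-class
# target task, dummy tensors standing in for a real DataLoader.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 10  # assumed size of the new task's label set

# 1. Start from a model pre-trained on the original task (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

# 2. Freeze the backbone so the knowledge learned on the source task is kept.
for param in model.parameters():
    param.requires_grad = False

# 3. Replace the final layer with a new head sized for the target task;
#    only this layer's parameters are updated during fine-tuning.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# 4. One illustrative training step on a dummy batch.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```

When more target-task data is available, a common variant is to unfreeze some or all backbone layers and continue training with a smaller learning rate.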

(Image credit: Subodh Malgonde)

Papers

Showing 8801–8825 of 10307 papers

Title | Status | Hype
Supervised Transfer Learning for Product Information Question Answering | | 0
Towards Explainable, Privacy-Preserved Human-Motion Affect Recognition | | 0
Pre-text Representation Transfer for Deep Learning with Limited Imbalanced Data : Application to CT-based COVID-19 Detection | | 0
Supervised Transfer Learning Framework for Fault Diagnosis in Wind Turbines | | 0
A Deeper Look at 3D Shape Classifiers | | 0
Supervised Understanding of Word Embeddings | | 0
Pretrained language model transfer on neural named entity recognition in Indonesian conversational texts | | 0
Pre-Trained Model Recommendation for Downstream Fine-tuning | | 0
A deep convolutional neural network for classification of Aedes albopictus mosquitoes | | 0
Pre-Trained Models: Past, Present and Future | | 0
A Deep Analysis of Transfer Learning Based Breast Cancer Detection Using Histopathology Images | | 0
Pre-trained Word Embeddings for Goal-conditional Transfer Learning in Reinforcement Learning | | 0
A decision framework for selecting information-transfer strategies in population-based SHM | | 0
Pre-training Auto-regressive Robotic Models with 4D Representations | | 0
Pretraining boosts out-of-domain robustness for pose estimation | | 0
Addressing word-order Divergence in Multilingual Neural Machine Translation for extremely Low Resource Languages | | 0
Addressing the Challenges of Cross-Lingual Hate Speech Detection | | 0
Pretraining for Conditional Generation with Pseudo Self Attention | | 0
Pre-Training Graph Contrastive Masked Autoencoders are Strong Distillers for EEG | | 0
Addressing modern and practical challenges in machine learning: A survey of online federated and transfer learning | | 0
Pre-training Tensor-Train Networks Facilitates Machine Learning with Variational Quantum Circuits | | 0
Pre-training Text-to-Text Transformers to Write and Reason with Concepts | | 0
Pre-training transformer-based framework on large-scale pediatric claims data for downstream population-specific tasks | | 0
Pre-training Transformers for Molecular Property Prediction Using Reaction Prediction | | 0
Supervising the Transfer of Reasoning Patterns in VQA | | 0
Page 353 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | | Unverified
2 | DFA-ENT | Accuracy | 69.2 | | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | | Unverified
4 | EasyTL | Accuracy | 63.3 | | Unverified
5 | MEDA | Accuracy | 60.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | | Unverified