SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
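A minimal sketch of the idea above, using NumPy only: a fixed random projection stands in for a frozen pre-trained feature extractor (in practice this would be, say, a pre-trained CNN backbone with gradients disabled), and only a small task-specific head is trained on the new data. All names and the toy dataset are illustrative assumptions, not from any specific paper listed on this page.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pre-trained feature extractor: a fixed random
# projection followed by ReLU. Its weights are never updated, mimicking
# a backbone whose parameters are frozen during fine-tuning.
W_frozen = rng.normal(size=(2, 16))

def extract_features(x):
    # Frozen forward pass: no gradient ever flows into W_frozen.
    return np.maximum(x @ W_frozen, 0.0)

# Small labeled dataset for the *new* task (two separable Gaussian blobs),
# standing in for the limited target-domain data transfer learning assumes.
X = np.vstack([rng.normal(-1, 0.3, size=(50, 2)),
               rng.normal(+1, 0.3, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# New task-specific head: a logistic-regression layer trained from scratch.
feats = extract_features(X)
w = np.zeros(16)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid over head logits
    grad = p - y                                # dLoss/dlogits for log-loss
    w -= lr * feats.T @ grad / len(y)           # only head weights move
    b -= lr * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5).astype(int)
accuracy = (preds == y).mean()
print(f"head-only training accuracy: {accuracy:.2f}")
```

Because only `w` and `b` are optimized, the example trains far fewer parameters than the full model would have, which is why transfer learning works with limited target-task data.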

(Image credit: Subodh Malgonde)

Papers

Showing 2776–2800 of 10307 papers

Title | Status | Hype
Personalized Purchase Prediction of Market Baskets with Wasserstein-Based Sequence Matching | Code | 0
Facial Landmark Predictions with Applications to Metaverse | Code | 0
Linking emotions to behaviors through deep transfer learning | Code | 0
fairseq S2T: Fast Speech-to-Text Modeling with fairseq | Code | 0
Deep Transfer Learning Based Downlink Channel Prediction for FDD Massive MIMO Systems | Code | 0
Cross-Institutional Transfer Learning for Educational Models: Implications for Model Performance, Fairness, and Equity | Code | 0
Liver Fibrosis and NAS scoring from CT images using self-supervised learning and texture encoding | Code | 0
Facial Beauty Analysis Using Distribution Prediction and CNN Ensembles | Code | 0
Facial Emotion Recognition Under Mask Coverage Using a Data Augmentation Technique | Code | 0
Extreme Multi-Domain, Multi-Task Learning With Unified Text-to-Text Transfer Transformers | Code | 0
Extracting temporal features into a spatial domain using autoencoders for sperm video analysis | Code | 0
Extending LLMs to New Languages: A Case Study of Llama and Persian Adaptation | Code | 0
Locality Preserving Joint Transfer for Domain Adaptation | Code | 0
A transfer learning-based deep learning approach for automated COVID-19 diagnosis with audio data | Code | 0
Extracting and Analysing Metaphors in Migration Media Discourse: towards a Metaphor Annotation Scheme | Code | 0
Facial Expression Recognition Under Partial Occlusion from Virtual Reality Headsets based on Transfer Learning | Code | 0
Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model | Code | 0
LoRA-PT: Low-Rank Adapting UNETR for Hippocampus Segmentation Using Principal Tensor Singular Values and Vectors | Code | 0
Federated Continual Graph Learning | Code | 0
Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses | Code | 0
Low Latency Privacy Preserving Inference | Code | 0
GERNERMED++: Transfer Learning in German Medical NLP | Code | 0
Improving Dialectal Slot and Intent Detection with Auxiliary Tasks: A Multi-Dialectal Bavarian Case Study | Code | 0
Exploring the Limits of Weakly Supervised Pretraining | Code | 0
Exploring the potential of transfer learning for metamodels of heterogeneous material deformation | Code | 0
Page 112 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | — | Unverified
2 | DFA-ENT | Accuracy | 69.2 | — | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | — | Unverified
4 | EasyTL | Accuracy | 63.3 | — | Unverified
5 | MEDA | Accuracy | 60.3 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | — | Unverified