SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge encoded in a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
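The recipe above (freeze a pre-trained feature extractor, train only a new task-specific head on the small target dataset) can be sketched as follows. This is a minimal illustrative sketch using NumPy only: the "pre-trained" backbone weights here are random placeholders standing in for weights learned on a large source task, and the dataset, shapes, and learning rate are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained feature extractor. In practice these weights
# would come from a model trained on a large source task; here they are
# random placeholders, NOT real pre-trained weights.
W_pretrained = rng.normal(size=(64, 16))

def extract_features(x):
    # Frozen backbone: forward pass only, W_pretrained is never updated.
    return np.tanh(x @ W_pretrained)

# Small hypothetical target-task dataset: 40 samples, 64 raw features.
X = rng.normal(size=(40, 64))
y = (X[:, 0] > 0).astype(float)

# New task-specific head, trained from scratch via logistic regression.
w_head = np.zeros(16)
b_head = 0.0
feats = extract_features(X)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head)))
    grad_w = feats.T @ (p - y) / len(y)   # gradient w.r.t. head weights
    grad_b = np.mean(p - y)               # gradient w.r.t. head bias
    w_head -= 0.5 * grad_w
    b_head -= 0.5 * grad_b

acc = np.mean(((feats @ w_head + b_head) > 0) == (y > 0.5))
print(f"head training accuracy: {acc:.2f}")
```

Fine-tuning, by contrast, would also update (some of) the backbone weights with a small learning rate rather than keeping them frozen.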

Papers

Showing 9201–9225 of 10307 papers

Title | Status | Hype
Did You Enjoy the Last Supper? An Experimental Study on Cross-Domain NER Models for the Art Domain | Code | 0
Did you offend me? Classification of Offensive Tweets in Hinglish Language | Code | 0
Pre-trained Perceptual Features Improve Differentially Private Image Generation | Code | 0
Differentially Private Image Classification from Features | Code | 0
Dimensionless Policies based on the Buckingham π Theorem: Is This a Good Way to Generalize Numerical Results? | Code | 0
Direct multimodal few-shot learning of speech and images | Code | 0
DiscoFuse: A Large-Scale Dataset for Discourse-Based Sentence Fusion | Code | 0
Discovering Phonetic Inventories with Crosslingual Automatic Speech Recognition | Code | 0
Discrete State-Action Abstraction via the Successor Representation | Code | 0
Discriminability-Transferability Trade-Off: An Information-Theoretic Perspective | Code | 0
Disease-informed Adaptation of Vision-Language Models | Code | 0
Disease Knowledge Transfer across Neurodegenerative Diseases | Code | 0
Disentangled Contrastive Learning for Social Recommendation | Code | 0
Disentangling and Mitigating the Impact of Task Similarity for Continual Learning | Code | 0
Distilling Efficient Language-Specific Models for Cross-Lingual Transfer | Code | 0
Distilling from Similar Tasks for Transfer Learning on a Budget | Code | 0
Distilling Image Dehazing With Heterogeneous Task Imitation | Code | 0
Distilling Knowledge for Designing Computational Imaging Systems | Code | 0
Text-Derived Knowledge Helps Vision: A Simple Cross-modal Distillation for Video-based Action Anticipation | Code | 0
Distilling the Knowledge of Romanian BERTs Using Multiple Teachers | Code | 0
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data | Code | 0
Distinguishing Natural and Computer-Generated Images using Multi-Colorspace fused EfficientNet | Code | 0
Distribution Matching for Self-Supervised Transfer Learning | Code | 0
Learning Diverse Options via InfoMax Termination Critic | Code | 0
Diverse Preference Augmentation with Multiple Domains for Cold-start Recommendations | Code | 0
Page 369 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | | Unverified
2 | DFA-ENT | Accuracy | 69.2 | | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | | Unverified
4 | EasyTL | Accuracy | 63.3 | | Unverified
5 | MEDA | Accuracy | 60.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | | Unverified