SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original one that the pre-trained model can be adapted with only minor modifications.
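A minimal sketch of this idea in PyTorch: freeze a pre-trained feature extractor and train only a fresh task-specific head. The tiny `backbone` below is a stand-in for a real pre-trained network (in practice you would load, e.g., a torchvision ResNet with pre-trained weights and drop its final layer); the function name and shapes are illustrative assumptions, not any particular paper's method.

```python
# Transfer-learning sketch: reuse a frozen pre-trained feature extractor,
# re-train only a new head on the target task.
import torch
import torch.nn as nn

def build_transfer_model(backbone: nn.Module, feature_dim: int,
                         num_target_classes: int) -> nn.Module:
    # Freeze the pre-trained feature extractor so its representations are reused as-is.
    for param in backbone.parameters():
        param.requires_grad = False
    # Attach a fresh, trainable classification head for the new task.
    head = nn.Linear(feature_dim, num_target_classes)
    return nn.Sequential(backbone, head)

# Stand-in "pre-trained" backbone (hypothetical; replace with a real pre-trained model).
backbone = nn.Sequential(nn.Flatten(), nn.Linear(32, 16), nn.ReLU())
model = build_transfer_model(backbone, feature_dim=16, num_target_classes=5)

# Only the new head's parameters are handed to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-3)

x = torch.randn(4, 32)   # a toy batch from the target task
logits = model(x)        # shape: (4, 5)
```

Because the backbone is frozen, each fine-tuning step updates only the head's weight and bias, which is what makes training feasible on small target datasets.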

(Image credit: Subodh Malgonde)

Papers

Showing papers 9626–9650 of 10307

- Investigation of Multimodal Features, Classifiers and Fusion Methods for Emotion Recognition
- Island-Based Evolutionary Computation with Diverse Surrogates and Adaptive Knowledge Transfer for High-Dimensional Data-Driven Optimization
- Is One Teacher Model Enough to Transfer Knowledge to a Student Model?
- Topology Only Pre-Training: Towards Generalised Multi-Domain Graph Models
- IV-tuning: Parameter-Efficient Transfer Learning for Infrared-Visible Tasks
- Gotta Adapt 'Em All: Joint Pixel and Feature-Level Domain Adaptation for Recognition in the Wild
- Joint Pre-training and Local Re-training: Transferable Representation Learning on Multi-source Knowledge Graphs
- Joint Training And Decoding for Multilingual End-to-End Simultaneous Speech Translation
- Just Pick a Sign: Optimizing Deep Multitask Models with Gradient Sign Dropout
- KartalOl: Transfer learning using deep neural network for iris segmentation and localization: New dataset for iris segmentation
- Kernel computations from large-scale random features obtained by Optical Processing Units
- Kernel learning for visual perception
- KFU NLP Team at SMM4H 2020 Tasks: Cross-lingual Transfer Learning with Pretrained Language Models for Drug Reactions
- KitchenScale: Learning to predict ingredient quantities from recipe contexts
- Knowledge-based Transfer Learning Explanation
- Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance
- Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression
- Knowledge from Large-Scale Protein Contact Prediction Models Can Be Transferred to the Data-Scarce RNA Contact Prediction Task
- Knowledge Grafting of Large Language Models
- Knowledge-Guided Multiview Deep Curriculum Learning for Elbow Fracture Classification
- Knowledge Guided Semi-Supervised Learning for Quality Assessment of User Generated Videos
- Knowledge Mining and Transferring for Domain Adaptive Object Detection
- Knowledge transfer across cell lines using Hybrid Gaussian Process models with entity embedding vectors
- Knowledge Transfer Based Fine-grained Visual Classification
- Knowledge Transfer-Driven Few-Shot Class-Incremental Learning
Page 386 of 413

Benchmark Results

#  Model      Metric    Claimed  Verified  Status
1  APCLIP     Accuracy  84.2     -         Unverified
2  DFA-ENT    Accuracy  69.2     -         Unverified
3  DFA-SAFN   Accuracy  69.1     -         Unverified
4  EasyTL     Accuracy  63.3     -         Unverified
5  MEDA       Accuracy  60.3     -         Unverified

#  Model  Metric            Claimed  Verified  Status
1  CNN    10-20% Mask PSNR  3.23     -         Unverified

#  Model                        Metric    Claimed  Verified  Status
1  Chatterjee, Dutta et al.[1]  Accuracy  96.12    -         Unverified

#  Model      Metric    Claimed  Verified  Status
1  Co-Tuning  Accuracy  85.65    -         Unverified

#  Model            Metric  Claimed  Verified  Status
1  Physical Access  EER     5.74     -         Unverified

#  Model          Metric  Claimed  Verified  Status
1  riadd.aucmedi  AUROC   0.95     -         Unverified