SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

( Image credit: Subodh Malgonde )
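The idea above can be sketched in a few lines of pure Python. This is a minimal illustration, not any particular library's API: the "pretrained" feature extractor, the toy target task, and all function names here are invented for the example. The backbone's features stay frozen while only a small new head is fitted on the target data.

```python
# Minimal transfer-learning sketch (illustrative only, pure Python).
# A frozen "pretrained" feature extractor stands in for a real backbone;
# fine-tuning touches only the new head's parameters.

def pretrained_features(x):
    """Frozen feature extractor: stands in for a backbone trained on a source task."""
    return [x, x * x]  # two fixed features; never updated below

def train_head(data, lr=0.1, epochs=200):
    """Fit a linear head on top of the frozen features (the adaptation step)."""
    w = [0.0, 0.0]  # head weights, one per frozen feature
    b = 0.0         # head bias
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            pred = w[0] * f[0] + w[1] * f[1] + b
            err = pred - y
            # gradient step on head parameters only; the backbone is untouched
            w[0] -= lr * err * f[0]
            w[1] -= lr * err * f[1]
            b -= lr * err
    return w, b

# Toy target task: y = 2*x^2 + 1, expressible with the frozen features,
# so the head alone can adapt to it (w ~ [0, 2], b ~ 1).
data = [(x / 10, 2 * (x / 10) ** 2 + 1) for x in range(-10, 11)]
w, b = train_head(data)
print(f"head weights: {w}, bias: {b:.2f}")
```

In a real workflow the frozen function would be the body of a pretrained network (with its final classification layer removed), and only the replacement head, or the head plus the last few layers, would receive gradient updates.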

Papers

Showing 25 of 10307 papers

| Title | Status | Hype |
|-------|--------|------|
| Virtual Knowledge Graph Construction for Zero-Shot Domain-Specific Document Retrieval | Code | 0 |
| Transfer String Kernel for Cross-Context DNA-Protein Binding Prediction | Code | 0 |
| Transfer Learning in Polyp and Endoscopic Tool Segmentation from Colonoscopy Images | Code | 0 |
| Transfer learning and subword sampling for asymmetric-resource one-to-many neural translation | Code | 0 |
| Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages | Code | 0 |
| Transfer Learning in Latent Contextual Bandits with Covariate Shift Through Causal Transportability | Code | 0 |
| A Transferable Adaptive Domain Adversarial Neural Network for Virtual Reality Augmented EMG-Based Gesture Recognition | Code | 0 |
| TransformCode: A Contrastive Learning Framework for Code Embedding via Subtree Transformation | Code | 0 |
| Transformer-Based Approaches for Automatic Music Transcription | Code | 0 |
| Transfer Learning Algorithm with Knowledge Division Level | Code | 0 |
| Virtual to Real Reinforcement Learning for Autonomous Driving | Code | 0 |
| Transfer Learning in Large-scale Gaussian Graphical Models with False Discovery Rate Control | Code | 0 |
| Transfer Learning in Information Criteria-based Feature Selection | Code | 0 |
| Transfer learning in hybrid classical-quantum neural networks | Code | 0 |
| Transfer Learning in ECG Diagnosis: Is It Effective? | Code | 0 |
| Tigrinya Neural Machine Translation with Transfer Learning for Humanitarian Response | Code | 0 |
| Unsupervised Representation Learning to Aid Semi-Supervised Meta Learning | Code | 0 |
| We're Calling an Intervention: Exploring Fundamental Hurdles in Adapting Language Models to Nonstandard Text | Code | 0 |
| Transformer-CNN: Fast and Reliable tool for QSAR | Code | 0 |
| To pretrain or not to pretrain? A case study of domain-specific pretraining for semantic segmentation in histopathology | Code | 0 |
| Unsupervised Spike Depth Estimation via Cross-modality Cross-domain Knowledge Transfer | Code | 0 |
| Transfer Learning in Deep Learning Models for Building Load Forecasting: Case of Limited Data | Code | 0 |
| Transformers as Algorithms: Generalization and Stability in In-context Learning | Code | 0 |
| Transfer Learning from Visual Speech Recognition to Mouthing Recognition in German Sign Language | Code | 0 |
Page 340 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | Chatterjee, Dutta et al.[1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |