SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model when solving a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
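The usual recipe can be sketched in a few lines: keep the pre-trained feature extractor frozen and train only a small new head on the target task. The snippet below is a minimal, self-contained illustration of that idea — the "pretrained" weights here are random stand-ins (in practice they would come from a model trained on a large source dataset), and all names and data are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" feature extractor: a fixed linear map + ReLU.
# (Stand-in for real pre-trained weights; it is never updated below.)
W_pre = rng.normal(size=(16, 8))

def extract_features(x):
    return np.maximum(x @ W_pre, 0.0)  # frozen forward pass

# New target task: a tiny synthetic binary classification dataset.
X = rng.normal(size=(64, 16))
y = (X[:, 0] > 0).astype(float)

# Only the new head (w, b) is trained — this is the "fine-tuning" step.
w = np.zeros(8)
b = 0.0
lr = 0.5
for _ in range(200):
    F = extract_features(X)
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # sigmoid head
    grad = p - y                            # logistic-loss gradient
    w -= lr * F.T @ grad / len(y)
    b -= lr * grad.mean()

acc = ((p > 0.5) == y).mean()
```

Freezing the extractor keeps the number of trainable parameters small, which is exactly why transfer learning works with limited target data; unfreezing some or all of `W_pre` with a lower learning rate is the usual next step when more data is available.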

(Image credit: Subodh Malgonde)

Papers

Showing 501–525 of 10,307 papers

Title | Status | Hype
FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition | Code | 1
Matrix Information Theory for Self-Supervised Learning | Code | 1
Distilling BlackBox to Interpretable models for Efficient Transfer Learning | Code | 1
Making Offline RL Online: Collaborative World Models for Offline Visual Reinforcement Learning | Code | 1
Exploring Adapter-based Transfer Learning for Recommender Systems: Empirical Studies and Practical Insights | Code | 1
Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1
ComSL: A Composite Speech-Language Model for End-to-End Speech-to-Text Translation | Code | 1
TOAST: Transfer Learning via Attention Steering | Code | 1
CREATOR: Tool Creation for Disentangling Abstract and Concrete Reasoning of Large Language Models | Code | 1
Improving few-shot learning-based protein engineering with evolutionary sampling | Code | 1
MetaAdapt: Domain Adaptive Few-Shot Misinformation Detection via Meta Learning | Code | 1
Revisiting pre-trained remote sensing model benchmarks: resizing and normalization matters | Code | 1
Denoised Self-Augmented Learning for Social Recommendation | Code | 1
PromptNER: A Prompting Method for Few-shot Named Entity Recognition via k Nearest Neighbor Search | Code | 1
PTGB: Pre-Train Graph Neural Networks for Brain Network Analysis | Code | 1
Efficient ConvBN Blocks for Transfer Learning and Beyond | Code | 1
One-Prompt to Segment All Medical Images | Code | 1
AD-KD: Attribution-Driven Knowledge Distillation for Language Model Compression | Code | 1
Real-Time Flying Object Detection with YOLOv8 | Code | 1
Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation | Code | 1
CLIP-VG: Self-paced Curriculum Adapting of CLIP for Visual Grounding | Code | 1
An Ensemble Approach for Automated Theorem Proving Based on Efficient Name Invariant Graph Neural Representations | Code | 1
A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1
Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction | Code | 1
Improving Implicit Feedback-Based Recommendation through Multi-Behavior Alignment | Code | 1
Page 21 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified