SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
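The idea above can be sketched in a few lines: keep a pre-trained feature extractor frozen and train only a small head on the limited target-task data. This is a minimal, self-contained NumPy toy (the "pretrained" weights here are random stand-ins, not an actual trained backbone), not a definitive implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pretrained" backbone: a random stand-in for weights that would
# normally come from training on a large source task.
W_feat = 0.25 * rng.normal(size=(16, 8))

def extract_features(x):
    # The backbone is frozen: W_feat is never updated during fine-tuning.
    return np.tanh(x @ W_feat)

# Small target-task dataset: too few samples to train a full model from scratch.
X = rng.normal(size=(40, 16))
y = (X[:, 0] > 0).astype(float)

# Trainable head: logistic regression fitted on the frozen features.
w, b = np.zeros(8), 0.0
feats = extract_features(X)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid
    grad = p - y                                # gradient of log loss w.r.t. logits
    w -= 0.1 * feats.T @ grad / len(y)          # only the head is updated
    b -= 0.1 * grad.mean()

accuracy = ((feats @ w + b > 0) == (y > 0.5)).mean()
```

In practice the frozen extractor would be a real pre-trained network (e.g. an ImageNet backbone or a pre-trained language model), and "minor modifications" often means replacing and training just the final layer, optionally unfreezing more layers later.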

(Image credit: Subodh Malgonde)

Papers

Showing 3251–3275 of 10307 papers

Title | Status | Hype
Distilling BlackBox to Interpretable models for Efficient Transfer Learning | Code | 1
Live American Sign Language Letter Classification with Convolutional Neural Networks | – | 0
Do We Really Need a Large Number of Visual Prompts? | – | 0
A Multi-Resolution Physics-Informed Recurrent Neural Network: Formulation and Application to Musculoskeletal Systems | – | 0
BiomedGPT: A Generalist Vision-Language Foundation Model for Diverse Biomedical Tasks | Code | 2
Representation Transfer Learning via Multiple Pre-trained models for Linear Regression | – | 0
Collective Knowledge Graph Completion with Mutual Knowledge Distillation | – | 0
Transfer Learning for Personality Perception via Speech Emotion Recognition | – | 0
ComSL: A Composite Speech-Language Model for End-to-End Speech-to-Text Translation | Code | 1
TOAST: Transfer Learning via Attention Steering | Code | 1
Making Offline RL Online: Collaborative World Models for Offline Visual Reinforcement Learning | Code | 1
READ: Recurrent Adaptation of Large Transformers | – | 0
Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1
Deep Learning-based Bio-Medical Image Segmentation using UNet Architecture and Transfer Learning | – | 0
Exploring Adapter-based Transfer Learning for Recommender Systems: Empirical Studies and Practical Insights | Code | 1
Improving few-shot learning-based protein engineering with evolutionary sampling | Code | 1
Few-shot Unified Question Answering: Tuning Models or Prompts? | – | 0
A Two-Step Deep Learning Method for 3DCT-2DUS Kidney Registration During Breathing | – | 0
Beyond Shared Vocabulary: Increasing Representational Word Similarities across Languages for Multilingual Machine Translation | Code | 0
Amplitude-Independent Machine Learning for PPG through Visibility Graphs and Transfer Learning | – | 0
Selective Pre-training for Private Fine-tuning | Code | 0
Cross-lingual Knowledge Transfer and Iterative Pseudo-labeling for Low-Resource Speech Recognition with Transducers | – | 0
CREATOR: Tool Creation for Disentangling Abstract and Concrete Reasoning of Large Language Models | Code | 1
Topic-driven Distant Supervision Framework for Macro-level Discourse Parsing | – | 0
Deep Transductive Transfer Learning for Automatic Target Recognition | – | 0
Page 131 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | – | Unverified
2 | DFA-ENT | Accuracy | 69.2 | – | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | – | Unverified
4 | EasyTL | Accuracy | 63.3 | – | Unverified
5 | MEDA | Accuracy | 60.3 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | – | Unverified