SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
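The idea above can be sketched in a few lines: keep a pre-trained feature extractor frozen and train only a small new head on the target task. This is a minimal, framework-free illustration; the `pretrained_features` function and the toy target task are hypothetical stand-ins, not any specific model from the papers listed below.

```python
# Minimal transfer-learning sketch (pure Python, no frameworks).
# A "pretrained" feature extractor is frozen; only a small linear
# head is trained on the new task. All names are illustrative.

def pretrained_features(x):
    """Frozen feature extractor: maps a raw input to 2 features.
    Stands in for a network trained on a large source task."""
    return [x, x * x]

def train_head(data, lr=0.01, epochs=500):
    """Fit only the head parameters on the target task: a linear
    model on top of the frozen features, via gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)       # frozen: never updated
            pred = w[0] * f[0] + w[1] * f[1] + b
            err = pred - y
            w[0] -= lr * err * f[0]          # update the head only
            w[1] -= lr * err * f[1]
            b -= lr * err
    return w, b

# Small target dataset: y = x^2 + 1 (exactly linear in the features)
data = [(x / 2, (x / 2) ** 2 + 1) for x in range(-4, 5)]
w, b = train_head(data)
pred = w[0] * 3 + w[1] * 9 + b               # prediction for x = 3
```

Because the frozen features already encode the structure the target task needs, only a handful of head parameters must be learned, which is why transfer learning works with little target data.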

Papers

Showing 951–1000 of 10307 papers

Title | Status | Hype
Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data | Code | 1
Motion Style Transfer: Modular Low-Rank Adaptation for Deep Motion Forecasting | Code | 1
MoVi: A large multi-purpose human motion and video dataset | Code | 1
A Simple Multi-Modality Transfer Learning Baseline for Sign Language Translation | Code | 1
Are You Stealing My Model? Sample Correlation for Fingerprinting Deep Neural Networks | Code | 1
MS-Net: Multi-Site Network for Improving Prostate Segmentation with Heterogeneous MRI Data | Code | 1
MT3: Multi-Task Multitrack Music Transcription | Code | 1
MToP: A MATLAB Optimization Platform for Evolutionary Multitasking | Code | 1
Boosted Neural Decoders: Achieving Extreme Reliability of LDPC Codes for 6G Networks | Code | 1
ArMATH: a Dataset for Solving Arabic Math Word Problems | Code | 1
Multi-Domain Multilingual Question Answering | Code | 1
Multi-domain Recommendation with Embedding Disentangling and Domain Alignment | Code | 1
Multi-Head Distillation for Continual Unsupervised Domain Adaptation in Semantic Segmentation | Code | 1
MultiInstruct: Improving Multi-Modal Zero-Shot Learning via Instruction Tuning | Code | 1
Multi-level Knowledge Distillation via Knowledge Alignment and Correlation | Code | 1
Multilingual acoustic word embedding models for processing zero-resource languages | Code | 1
Breaking the Data Barrier -- Building GUI Agents Through Task Generalization | Code | 1
ATTEMPT: Parameter-Efficient Multi-task Tuning via Attentional Mixtures of Soft Prompts | Code | 1
Accurate Clinical Toxicity Prediction using Multi-task Deep Neural Nets and Contrastive Molecular Explanations | Code | 1
Multi-Modality is All You Need for Transferable Recommender Systems | Code | 1
AdaBoost-CNN: An adaptive boosting algorithm for convolutional neural networks to classify multi-class imbalanced datasets using transfer learning | Code | 1
MV-Adapter: Multimodal Video Transfer Learning for Video Text Retrieval | Code | 1
ArtNeRF: A Stylized Neural Field for 3D-Aware Cartoonized Face Synthesis | Code | 1
ARWKV: Pretrain is not what we need, an RNN-Attention-Based Language Model Born from Transformer | Code | 1
MultiReQA: A Cross-Domain Evaluation for Retrieval Question Answering Models | Code | 1
A Scalable and Generalizable Pathloss Map Prediction | Code | 1
A Further Study of Unsupervised Pre-training for Transformer Based Speech Recognition | Code | 1
Multistream Gaze Estimation with Anatomical Eye Region Isolation by Synthetic to Real Transfer Learning | Code | 1
A fuzzy distance-based ensemble of deep models for cervical cancer detection | Code | 1
Knowledge Transfer in Multi-Task Deep Reinforcement Learning for Continuous Control | Code | 1
aschern at SemEval-2020 Task 11: It Takes Three to Tango: RoBERTa, CRF, and Transfer Learning | Code | 1
Multi-Task Multi-Scale Contrastive Knowledge Distillation for Efficient Medical Image Segmentation | Code | 1
Bridging the User-side Knowledge Gap in Knowledge-aware Recommendations with Large Language Models | Code | 1
Mutual Contrastive Learning for Visual Representation Learning | Code | 1
CARLANE: A Lane Detection Benchmark for Unsupervised Domain Adaptation from Simulation to multiple Real-World Domains | Code | 1
NanoNet: Real-Time Polyp Segmentation in Video Capsule Endoscopy and Colonoscopy | Code | 1
Neural Code Search Revisited: Enhancing Code Snippet Retrieval through Natural Language Intent | Code | 1
Neural Eigenfunctions Are Structured Representation Learners | Code | 1
Adapting LLaMA Decoder to Vision Transformer | Code | 1
Neural Priming for Sample-Efficient Adaptation | Code | 1
AI4COVID-19: AI Enabled Preliminary Diagnosis for COVID-19 from Cough Samples via an App | Code | 1
BirdSAT: Cross-View Contrastive Masked Autoencoders for Bird Species Classification and Mapping | Code | 1
Non-binary deep transfer learning for image classification | Code | 1
Non-IID Transfer Learning on Graphs | Code | 1
BioREx: Improving Biomedical Relation Extraction by Leveraging Heterogeneous Datasets | Code | 1
n-Reference Transfer Learning for Saliency Prediction | Code | 1
A Simple and Effective Approach to Automatic Post-Editing with Transfer Learning | Code | 1
Omnidirectional Transfer for Quasilinear Lifelong Learning | Code | 1
BIOSCAN-5M: A Multimodal Dataset for Insect Biodiversity | Code | 1
Page 20 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | | Unverified
2 | DFA-ENT | Accuracy | 69.2 | | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | | Unverified
4 | EasyTL | Accuracy | 63.3 | | Unverified
5 | MEDA | Accuracy | 60.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | | Unverified