SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model when solving a new problem. This is useful when too little data is available to train a new model from scratch, or when the new task is similar enough to the original one that the pre-trained model can be adapted with only minor modifications.
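The workflow described above, freezing a pre-trained feature extractor and training only a small task-specific head on limited data, can be sketched in a few lines. This is a minimal illustration, not a real pipeline: the random-projection backbone and the names `W_backbone`, `extract_features`, and `w_head` are hypothetical stand-ins for an actual pre-trained network (for example, a torchvision backbone).

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: "pretrained" backbone (frozen) --------------------------------
# Stand-in for a network trained on a large source task: a fixed projection
# mapping raw 16-dim inputs to an 8-dim feature space. Its weights are never
# updated, which is the "knowledge transfer" part of the sketch.
W_backbone = rng.normal(size=(16, 8)) * 0.25   # frozen weights (hypothetical)

def extract_features(x):
    """Frozen backbone: raw input -> feature vector (no further training)."""
    return np.tanh(x @ W_backbone)

# --- Stage 2: fine-tune only a new head on the small target dataset ---------
X_small = rng.normal(size=(40, 16))             # limited target-task data
y_small = (X_small.sum(axis=1) > 0).astype(float)

feats = extract_features(X_small)
w_head = np.zeros(8)                            # new task-specific linear head
lr = 0.5
for _ in range(200):                            # logistic-regression training
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head)))
    w_head -= lr * feats.T @ (p - y_small) / len(y_small)

preds = 1.0 / (1.0 + np.exp(-(extract_features(X_small) @ w_head))) > 0.5
accuracy = (preds == y_small).mean()
print(f"training accuracy of the new head: {accuracy:.2f}")
```

Only the 8 head weights are trained here, versus 128 backbone weights that stay fixed; in a real setting the ratio is far more extreme, which is why fine-tuning a head works with small datasets.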

(Image credit: Subodh Malgonde)

Papers

Showing papers 701-750 of 10,307

| Title | Status | Hype |
| --- | --- | --- |
| Does Pretraining for Summarization Require Knowledge Transfer? | Code | 1 |
| DoG is SGD's Best Friend: A Parameter-Free Dynamic Step Size Schedule | Code | 1 |
| Benchmarking Detection Transfer Learning with Vision Transformers | Code | 1 |
| DoubleU-Net: A Deep Convolutional Neural Network for Medical Image Segmentation | Code | 1 |
| DREAM+: Efficient Dataset Distillation by Bidirectional Representative Matching | Code | 1 |
| Drug and Disease Interpretation Learning with Biomedical Entity Representation Transformer | Code | 1 |
| Duality Diagram Similarity: a generic framework for initialization selection in task transfer learning | Code | 1 |
| Dual-Teacher++: Exploiting Intra-domain and Inter-domain Knowledge with Reliable Transfer for Cardiac Segmentation | Code | 1 |
| Algorithmic encoding of protected characteristics in image-based models for disease detection | Code | 1 |
| Enhanced Gaussian Process Dynamical Models with Knowledge Transfer for Long-term Battery Degradation Forecasting | Code | 1 |
| A Comprehensive Study on Torchvision Pre-trained Models for Fine-grained Inter-species Classification | Code | 1 |
| Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search | Code | 1 |
| EEG Channel Interpolation Using Deep Encoder-decoder Networks | Code | 1 |
| EEG-Reptile: An Automatized Reptile-Based Meta-Learning Library for BCIs | Code | 1 |
| Effect of Pre-Training Scale on Intra- and Inter-Domain Full and Few-Shot Transfer Learning for Natural and Medical X-Ray Chest Images | Code | 1 |
| Efficient Adaptation of Large Vision Transformer via Adapter Re-Composing | Code | 1 |
| Efficient Crowd Counting via Structured Knowledge Transfer | Code | 1 |
| Benchmarking and scaling of deep learning models for land cover image classification | Code | 1 |
| Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1 |
| An Empirical Analysis of Image-Based Learning Techniques for Malware Classification | Code | 1 |
| Efficient Training of Large Vision Models via Advanced Automated Progressive Learning | Code | 1 |
| Efficient transfer learning for NLP with ELECTRA | Code | 1 |
| BadMerging: Backdoor Attacks Against Model Merging | Code | 1 |
| An Empirical Investigation of Model-to-Model Distribution Shifts in Trained Convolutional Filters | Code | 1 |
| Emergent Communication Pretraining for Few-Shot Machine Translation | Code | 1 |
| Unlocking Emergent Modularity in Large Language Models | Code | 1 |
| Empowering parameter-efficient transfer learning by recognizing the kernel structure in self-attention | Code | 1 |
| Enabling Country-Scale Land Cover Mapping with Meter-Resolution Satellite Imagery | Code | 1 |
| An Empirical Study on Cross-X Transfer for Legal Judgment Prediction | Code | 1 |
| An Empirical Study on Large-Scale Multi-Label Text Classification Including Few and Zero-Shot Labels | Code | 1 |
| Enhancement of price trend trading strategies via image-induced importance weights | Code | 1 |
| Enhancing High-Resolution 3D Generation through Pixel-wise Gradient Clipping | Code | 1 |
| Enhancing Traffic Safety with Parallel Dense Video Captioning for End-to-End Event Analysis | Code | 1 |
| Bag of Tricks for Image Classification with Convolutional Neural Networks | Code | 1 |
| AVocaDo: Strategy for Adapting Vocabulary to Downstream Domain | Code | 1 |
| An Encoder-Decoder Based Audio Captioning System With Transfer and Reinforcement Learning | Code | 1 |
| A Visual Analytics Framework for Explaining and Diagnosing Transfer Learning Processes | Code | 1 |
| Evaluating Protein Transfer Learning with TAPE | Code | 1 |
| An Ensemble Approach for Automated Theorem Proving Based on Efficient Name Invariant Graph Neural Representations | Code | 1 |
| EViT: An Eagle Vision Transformer with Bi-Fovea Self-Attention | Code | 1 |
| A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1 |
| Analyzing Redundancy in Pretrained Transformer Models | Code | 1 |
| A deep learning framework for solution and discovery in solid mechanics | Code | 1 |
| Exploring Incompatible Knowledge Transfer in Few-shot Image Generation | Code | 1 |
| BARThez: a Skilled Pretrained French Sequence-to-Sequence Model | Code | 1 |
| Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones | Code | 1 |
| Boosting Weakly Supervised Object Detection with Progressive Knowledge Transfer | Code | 1 |
| Exploring Transfer Learning for Low Resource Emotional TTS | Code | 1 |
| Automatic identification of segmentation errors for radiotherapy using geometric learning | Code | 1 |
| Adapting Pre-trained Vision Transformers from 2D to 3D through Weight Inflation Improves Medical Image Segmentation | Code | 1 |
Page 15 of 207

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |