SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge captured by a pretrained model when solving a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pretrained model can be adapted with only minor modifications.
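The common recipe described above (freeze a pretrained backbone, train only a new task-specific head on limited data) can be sketched in a toy, self-contained way. Everything here is illustrative: the "pretrained" backbone is a frozen random projection standing in for real pretrained layers, and the dataset is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a FROZEN feature extractor.
# In practice this would be, e.g., the lower layers of an ImageNet model.
W_pre = rng.normal(size=(2, 16))

def extract_features(x):
    # Frozen weights: never updated during fine-tuning.
    return np.tanh(x @ W_pre)

# Small target-task dataset (toy data): two Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),
               rng.normal(+1.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# New task-specific head, trained from scratch (logistic regression).
F = extract_features(X)
w, b = np.zeros(16), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))  # sigmoid predictions
    w -= 0.5 * (F.T @ (p - y)) / len(y)     # gradient step on log loss
    b -= 0.5 * np.mean(p - y)

acc = np.mean(((1.0 / (1.0 + np.exp(-(F @ w + b)))) > 0.5) == y)
print(f"head-only fine-tuning accuracy: {acc:.2f}")
```

Only the 17 head parameters are trained; the backbone's knowledge is reused as-is, which is why this works even with a small target dataset.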

(Image credit: Subodh Malgonde)

Papers

Showing 176–200 of 10,307 papers

Title | Status | Hype
BirdSAT: Cross-View Contrastive Masked Autoencoders for Bird Species Classification and Mapping | Code | 1
Benchmarking Detection Transfer Learning with Vision Transformers | Code | 1
Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1
Accurate Clinical Toxicity Prediction using Multi-task Deep Neural Nets and Contrastive Molecular Explanations | Code | 1
Accuracy enhancement method for speech emotion recognition from spectrogram using temporal frequency correlation and positional information learning through knowledge transfer | Code | 1
Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search | Code | 1
Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
BadMerging: Backdoor Attacks Against Model Merging | Code | 1
Bag of Tricks for Image Classification with Convolutional Neural Networks | Code | 1
Domain Prompt Learning for Efficiently Adapting CLIP to Unseen Domains | Code | 1
Amplifying Membership Exposure via Data Poisoning | Code | 1
BARThez: a Skilled Pretrained French Sequence-to-Sequence Model | Code | 1
AMMUS : A Survey of Transformer-based Pretrained Models in Natural Language Processing | Code | 1
2021 BEETL Competition: Advancing Transfer Learning for Subject Independence & Heterogenous EEG Data Sets | Code | 1
AVocaDo: Strategy for Adapting Vocabulary to Downstream Domain | Code | 1
Neuro2Semantic: A Transfer Learning Framework for Semantic Reconstruction of Continuous Language from Human Intracranial EEG | Code | 1
Anatomical Foundation Models for Brain MRIs | Code | 1
A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1
Enhanced Gaussian Process Dynamical Models with Knowledge Transfer for Long-term Battery Degradation Forecasting | Code | 1
BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning | Code | 1
Automatic Noise Filtering with Dynamic Sparse Training in Deep Reinforcement Learning | Code | 1
AutoTune: Automatically Tuning Convolutional Neural Networks for Improved Transfer Learning | Code | 1
Aligning Pretraining for Detection via Object-Level Contrastive Learning | Code | 1
SentenceMIM: A Latent Variable Language Model | Code | 1
Page 8 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | | Unverified
2 | DFA-ENT | Accuracy | 69.2 | | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | | Unverified
4 | EasyTL | Accuracy | 63.3 | | Unverified
5 | MEDA | Accuracy | 60.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | | Unverified