SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications, typically by freezing most of its weights and retraining a small task-specific portion.

(Image credit: Subodh Malgonde)
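The workflow described above can be sketched as follows. This is a minimal, illustrative NumPy example, not any particular library's API: the frozen `W_backbone` matrix stands in for weights learned on a source task (in practice it would be loaded from a pre-trained model rather than randomly initialized), and only the small task-specific head is trained on the target data.

```python
import numpy as np

# Minimal transfer-learning sketch (illustrative only).
# W_backbone stands in for weights learned on a source task; in practice
# it would be loaded from a pre-trained model, not randomly initialized.
rng = np.random.default_rng(0)
W_backbone = rng.normal(size=(32, 64))   # frozen feature extractor

def features(x):
    # The frozen backbone maps raw inputs to a learned representation.
    return np.maximum(x @ W_backbone, 0.0)  # ReLU

# Only the small task-specific head is trained on the target task.
W_head = np.zeros((64, 3))               # 3 target classes

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Dummy target-task data: 90 samples with labels in {0, 1, 2}.
X = rng.normal(size=(90, 32))
y = rng.integers(0, 3, size=90)
Y = np.eye(3)[y]                          # one-hot labels

# Gradient descent on the head only; the backbone never changes.
H = features(X)
for _ in range(200):
    P = softmax(H @ W_head)
    grad = H.T @ (P - Y) / len(X)        # cross-entropy gradient w.r.t. head
    W_head -= 0.1 * grad

train_acc = (softmax(H @ W_head).argmax(axis=1) == y).mean()
```

Because the backbone is never updated, the number of trainable parameters is just 64 x 3, which is why fine-tuning a head can work with far less target-task data than training the whole model from scratch.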

Papers

Showing 2251–2275 of 10307 papers

Title | Status | Hype
Generalizable Local Feature Pre-training for Deformable Shape Analysis | Code | 0
Generalized Adaptive Transfer Network: Enhancing Transfer Learning in Reinforcement Learning Across Domains | Code | 0
Generalized Funnelling: Ensemble Learning and Heterogeneous Document Embeddings for Cross-Lingual Text Classification | Code | 0
BatStyler: Advancing Multi-category Style Generation for Source-free Domain Generalization | Code | 0
BatSort: Enhanced Battery Classification with Transfer Learning for Battery Sorting and Recycling | Code | 0
GAN pretraining for deep convolutional autoencoders applied to Software-based Fingerprint Presentation Attack Detection | Code | 0
An Attention-based Representation Distillation Baseline for Multi-Label Continual Learning | Code | 0
GANTL: Towards Practical and Real-Time Topology Optimization with Conditional GANs and Transfer Learning | Code | 0
Addressee and Response Selection for Multilingual Conversation | Code | 0
Gated Domain Units for Multi-source Domain Generalization | Code | 0
BanglaNLP at BLP-2023 Task 2: Benchmarking different Transformer Models for Sentiment Analysis of Bangla Social Media Posts | Code | 0
Anatomy of Neural Language Models | Code | 0
Gammatonegram Representation for End-to-End Dysarthric Speech Processing Tasks: Speech Recognition, Speaker Identification, and Intelligibility Assessment | Code | 0
Anatomy-Aware Contrastive Representation Learning for Fetal Ultrasound | Code | 0
GAN Cocktail: mixing GANs without dataset access | Code | 0
Context selectivity with dynamic availability enables lifelong continual learning | Code | 0
Funnelling: A New Ensemble Method for Heterogeneous Transfer Learning and its Application to Cross-Lingual Text Classification | Code | 0
FUSE-ing Language Models: Zero-Shot Adapter Discovery for Prompt Optimization Across Tokenizers | Code | 0
Functional Knowledge Transfer with Self-supervised Representation Learning | Code | 0
FUSE: Label-Free Image-Event Joint Monocular Depth Estimation via Frequency-Decoupled Alignment and Degradation-Robust Fusion | Code | 0
FTA-FTL: A Fine-Tuned Aggregation Federated Transfer Learning Scheme for Lithology Microscopic Image Classification | Code | 0
FTL: Transfer Learning Nonlinear Plasma Dynamic Transitions in Low Dimensional Embeddings via Deep Neural Networks | Code | 0
Balanced joint maximum mean discrepancy for deep transfer learning | Code | 0
Unsupervised Representation Learning by Balanced Self Attention Matching | Code | 0
Entity Tracking via Effective Use of Multi-Task Learning Model and Mention-guided Decoding | Code | 0
Page 91 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified