SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
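As a toy illustration (not taken from any paper on this page), the idea can be sketched in NumPy: "pretrain" a linear model on a data-rich source task, freeze the learned weights as a feature extractor, and train only a small new head on a related, data-poor target task. All names and data below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_linear(X, y, lr=0.1, steps=200):
    """Plain logistic regression fit by gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Source task A: plenty of labeled data.
Xa = rng.normal(size=(500, 20))
ya = (Xa[:, :5].sum(axis=1) > 0).astype(float)

# "Pretrain" on task A; w_pre plays the role of the frozen backbone.
w_pre = train_linear(Xa, ya)

# Related target task B: only 30 labeled examples.
Xb = rng.normal(size=(30, 20))
yb = (Xb[:, :5].sum(axis=1) + 0.1 * Xb[:, 5] > 0).astype(float)

# Freeze the pretrained weights: use X @ w_pre as a learned 1-D feature,
# and fit only a tiny new head (scale + bias) on task B.
fb = (Xb @ w_pre)[:, None]
feats = np.hstack([fb, np.ones((len(fb), 1))])
head = train_linear(feats, yb)

preds = (1 / (1 + np.exp(-(feats @ head))) > 0.5)
acc = (preds == yb.astype(bool)).mean()
print(f"target-task accuracy with frozen pretrained features: {acc:.2f}")
```

Because only the two head parameters are trained on task B, the small target dataset is far less likely to be overfit than if all 20 weights were re-learned from scratch.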

(Image credit: Subodh Malgonde)

Papers

Showing 9351–9375 of 10307 papers

| Title | Status | Hype |
| --- | --- | --- |
| Exploring Driving-aware Salient Object Detection via Knowledge Transfer | Code | 0 |
| Exploring Large Language Models and Hierarchical Frameworks for Classification of Large Unstructured Legal Documents | Code | 0 |
| Exploring Methods for Building Dialects-Mandarin Code-Mixing Corpora: A Case Study in Taiwanese Hokkien | Code | 0 |
| Exploring Model Transferability through the Lens of Potential Energy | Code | 0 |
| Exploring Multilingual Syntactic Sentence Representations | Code | 0 |
| Exploring object-centric and scene-centric CNN features and their complementarity for human rights violations recognition in images | Code | 0 |
| Exploring Open-world Continual Learning with Knowns-Unknowns Knowledge Transfer | Code | 0 |
| Exploring Pre-Trained Transformers and Bilingual Transfer Learning for Arabic Coreference Resolution | Code | 0 |
| Exploring Self-Supervised Representation Learning For Low-Resource Medical Image Analysis | Code | 0 |
| Leveraging Cross-Lingual Transfer Learning in Spoken Named Entity Recognition Systems | Code | 0 |
| Exploring Target Representations for Masked Autoencoders | Code | 0 |
| Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers | Code | 0 |
| Exploring the Benefits of Visual Prompting in Differential Privacy | Code | 0 |
| Exploring the Effectiveness and Consistency of Task Selection in Intermediate-Task Transfer Learning | Code | 0 |
| Exploring the Limits of Weakly Supervised Pretraining | Code | 0 |
| Exploring the potential of transfer learning for metamodels of heterogeneous material deformation | Code | 0 |
| Exploring the Robustness of Task-oriented Dialogue Systems for Colloquial German Varieties | Code | 0 |
| Transformers on Multilingual Clause-Level Morphology | Code | 0 |
| Speech foundation models in healthcare: Effect of layer selection on pathological speech feature prediction | Code | 0 |
| Exploring User Retrieval Integration towards Large Language Models for Cross-Domain Sequential Recommendation | Code | 0 |
| Extending LLMs to New Languages: A Case Study of Llama and Persian Adaptation | Code | 0 |
| Extracting and Analysing Metaphors in Migration Media Discourse: towards a Metaphor Annotation Scheme | Code | 0 |
| Extracting temporal features into a spatial domain using autoencoders for sperm video analysis | Code | 0 |
| Extreme Multi-Domain, Multi-Task Learning With Unified Text-to-Text Transfer Transformers | Code | 0 |
| Facial Beauty Analysis Using Distribution Prediction and CNN Ensembles | Code | 0 |
Page 375 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |