SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
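The description above can be sketched in a few lines of code: keep a pre-trained feature extractor frozen, and train only a small new head on the limited target-task data. This is a minimal illustrative sketch, not any specific paper's method; the "pre-trained" backbone is stood in for by a fixed random projection, and all names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: in practice this would be a network
# trained on a large source task; here a fixed random projection plays that role.
W_pretrained = rng.normal(size=(8, 4))  # maps 8-dim inputs to 4-dim features

def extract_features(x):
    # Frozen backbone: W_pretrained is never updated during fine-tuning.
    return np.tanh(x @ W_pretrained)

# Small target-task dataset (hypothetical): 20 examples, binary labels.
X = rng.normal(size=(20, 8))
y = (X[:, 0] > 0).astype(float)

# New task head, trained from scratch on the small target dataset.
w_head = np.zeros(4)
b_head = 0.0
lr = 0.5

feats = extract_features(X)
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head)))  # sigmoid output
    grad = p - y                                          # logistic-loss gradient
    w_head -= lr * feats.T @ grad / len(y)                # update head only
    b_head -= lr * grad.mean()

preds = 1.0 / (1.0 + np.exp(-(feats @ w_head + b_head))) > 0.5
accuracy = (preds == y.astype(bool)).mean()
```

In a real setting the same pattern applies with a deep network: load pre-trained weights, mark the backbone's parameters as non-trainable, and optimize only the new output layer (optionally unfreezing deeper layers later for full fine-tuning).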

Papers

Showing 9901–9925 of 10307 papers

Title | Status | Hype
Neural Domain Adaptation for Biomedical Question Answering | Code | 0
Neural Machine Translation For Low Resource Languages | Code | 0
Neural Machine Translation of Clinical Text: An Empirical Investigation into Multilingual Pre-Trained Language Models and Transfer-Learning | Code | 0
Neural Networks Regularization Through Representation Learning | Code | 0
Neural Parameter Search for Slimmer Fine-Tuned Models and Better Transfer | Code | 0
Neural Stain-Style Transfer Learning using GAN for Histopathological Images | Code | 0
Neural Subgraph Isomorphism Counting | Code | 0
Neural Taskonomy: Inferring the Similarity of Task-Derived Representations from Brain Activity | Code | 0
Neuro-Symbolic Fusion of Wi-Fi Sensing Data for Passive Radar with Inter-Modal Knowledge Transfer | Code | 0
Neuro-symbolic Training for Reasoning over Spatial Language | Code | 0
New Domain, Major Effort? How Much Data is Necessary to Adapt a Temporal Tagger to the Voice Assistant Domain | Code | 0
NLP-based Feature Extraction for the Detection of COVID-19 Misinformation Videos on YouTube | Code | 0
Noise May Contain Transferable Knowledge: Understanding Semi-supervised Heterogeneous Domain Adaptation from an Empirical Perspective | Code | 0
NollySenti: Leveraging Transfer Learning and Machine Translation for Nigerian Movie Sentiment Classification | Code | 0
Non-asymptotic estimates for TUSLA algorithm for non-convex learning with applications to neural networks with ReLU activation function | Code | 0
Not All Attention is Needed: Parameter and Computation Efficient Transfer Learning for Multi-modal Large Language Models | Code | 0
Not All Languages are Equal: Insights into Multilingual Retrieval-Augmented Generation | Code | 0
No Train but Gain: Language Arithmetic for training-free Language Adapters enhancement | Code | 0
Novel Batch Active Learning Approach and Its Application to Synthetic Aperture Radar Datasets | Code | 0
Novel transfer learning schemes based on Siamese networks and synthetic data | Code | 0
NSF-MAP: Neurosymbolic Multimodal Fusion for Robust and Interpretable Anomaly Prediction in Assembly Pipelines | Code | 0
NTUA-SLP at IEST 2018: Ensemble of Neural Transfer Methods for Implicit Emotion Classification | Code | 0
NTUA-SLP at SemEval-2018 Task 1: Predicting Affective Content in Tweets with Deep Attentive RNNs and Transfer Learning | Code | 0
Obeying the Order: Introducing Ordered Transfer Hyperparameter Optimisation | Code | 0
Object discovery and representation networks | Code | 0
Page 397 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | — | Unverified
2 | DFA-ENT | Accuracy | 69.2 | — | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | — | Unverified
4 | EasyTL | Accuracy | 63.3 | — | Unverified
5 | MEDA | Accuracy | 60.3 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | — | Unverified