SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
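The idea above can be sketched in a few lines of plain Python. This is a deliberately toy illustration (not from any paper on this page): `pretrained_features` stands in for a frozen feature extractor learned on a source task, and only a tiny linear head is fit on the scarce target-task data. All names here are hypothetical.

```python
def pretrained_features(x):
    # Stand-in for a frozen, pretrained feature extractor: maps a raw
    # input to features that (we assume) proved useful on the source task.
    return [x, x * x]

def fit_head(data):
    # Fit a 1-D linear head y ~ w * f + b on the target data by ordinary
    # least squares over the second pretrained feature (x^2), keeping the
    # extractor itself frozen.
    xs = [pretrained_features(x)[1] for x, _ in data]
    ys = [y for _, y in data]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - w * mx
    return w, b

# Target task: y = 2 * x^2 + 1, with only three labelled examples --
# far too few to learn a feature extractor from scratch.
target_data = [(1.0, 3.0), (2.0, 9.0), (3.0, 19.0)]
w, b = fit_head(target_data)

def predict(x):
    return w * pretrained_features(x)[1] + b
```

Because the frozen features already encode the right representation, the head recovers `w = 2, b = 1` exactly from three examples; the same division of labour (frozen backbone, small trainable head) is what makes fine-tuning with limited data practical.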

Image credit: Subodh Malgonde

Papers

Showing 7051–7075 of 10307 papers

| Title | Status | Hype |
|---|---|---|
| Learning to diagnose cirrhosis from radiological and histological labels with joint self and weakly-supervised pretraining strategies | | 0 |
| On the Generalization for Transfer Learning: An Information-Theoretic Analysis | | 0 |
| Learning to Evaluate Translation Beyond English: BLEURT Submissions to the WMT Metrics 2020 Shared Task | | 0 |
| Simple Semantic Annotation and Situation Frames: Two Approaches to Basic Text Understanding in LORELEI | | 0 |
| Learning to Generate Textual Data | | 0 |
| Learning to Harmonize Cross-vendor X-ray Images by Non-linear Image Dynamics Correction | | 0 |
| Learning to Learn, from Transfer Learning to Domain Adaptation: A Unifying Perspective | | 0 |
| Learning to Learn: How to Continuously Teach Humans and Machines | | 0 |
| Learning to Learn: Meta-Critic Networks for Sample Efficient Learning | | 0 |
| Learning to Learn Unlearned Feature for Brain Tumor Segmentation | | 0 |
| Learning to Learn Weight Generation via Local Consistency Diffusion | | 0 |
| An Inductive Transfer Learning Approach using Cycle-consistent Adversarial Domain Adaptation with Application to Brain Tumor Segmentation | | 0 |
| Learning to Model the Tail | | 0 |
| Learning to Profile: User Meta-Profile Network for Few-Shot Learning | | 0 |
| A Comparative Study on Transfer Learning and Distance Metrics in Semantic Clustering over the COVID-19 Tweets | | 0 |
| Learning to Progressively Recognize New Named Entities with Sequence to Sequence Models | | 0 |
| Learning to Project for Cross-Task Knowledge Distillation | | 0 |
| An Improvement for Capsule Networks using Depthwise Separable Convolution | | 0 |
| Text-to-Speech for Under-Resourced Languages: Phoneme Mapping and Source Language Selection in Transfer Learning | | 0 |
| Learning to Rank based on Analogical Reasoning | | 0 |
| Learning to Rank Learning Curves | | 0 |
| Beyond Fine-tuning: Few-Sample Sentence Embedding Transfer | | 0 |
| Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | | 0 |
| Learning to rumble: Automated elephant call classification, detection and endpointing using deep architectures | | 0 |
| Learning to search for and detect objects in foveal images using deep learning | | 0 |
Page 283 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |