SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model when solving a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
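In code, the idea reduces to reusing a frozen feature extractor and training only a small task-specific head. Below is a minimal self-contained sketch in plain Python, where a fixed random projection stands in for real pre-trained backbone weights; the toy task and all names are illustrative, not taken from any paper listed on this page:

```python
import math
import random

random.seed(0)
D_IN, D_FEAT = 4, 8

# "Pre-trained" backbone weights, stood in here by a fixed random
# projection; in practice these would be loaded from a model trained
# on the source task (e.g. an ImageNet backbone).
W_pre = [[random.gauss(0, 1) for _ in range(D_FEAT)] for _ in range(D_IN)]

def extract_features(x):
    """Frozen backbone: W_pre is never updated during fine-tuning."""
    return [math.tanh(sum(x[i] * W_pre[i][j] for i in range(D_IN)))
            for j in range(D_FEAT)]

# Small labeled set for the new task (the limited-data setting).
X = [[random.gauss(0, 1) for _ in range(D_IN)] for _ in range(64)]
y = [1.0 if x[0] + x[1] > 0 else 0.0 for x in X]
F = [extract_features(x) for x in X]

# Only the new task head (w, b) is trained; the backbone stays fixed.
w, b, lr = [0.0] * D_FEAT, 0.0, 0.5
for _ in range(300):
    for f, target in zip(F, y):
        z = sum(wj * fj for wj, fj in zip(w, f)) + b
        p = 1 / (1 + math.exp(-z))      # sigmoid head
        g = p - target                  # logistic-loss gradient
        for j in range(D_FEAT):
            w[j] -= lr * g * f[j] / len(y)
        b -= lr * g / len(y)

preds = [1.0 if sum(wj * fj for wj, fj in zip(w, f)) + b > 0 else 0.0
         for f in F]
acc = sum(p == t for p, t in zip(preds, y)) / len(y)
print(f"train accuracy with frozen backbone: {acc:.2f}")
```

Freezing the backbone and fitting only the head is the cheapest form of transfer; many of the papers below instead unfreeze some or all backbone layers and fine-tune them at a small learning rate.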

(Image credit: Subodh Malgonde)

Papers

Showing 4801–4850 of 10307 papers

| Title | Status | Hype |
|---|---|---|
| Target Aware Network Architecture Search and Compression for Efficient Knowledge Transfer | Code | 0 |
| AutoKE: An automatic knowledge embedding framework for scientific machine learning | Code | 1 |
| ReFine: Re-randomization before Fine-tuning for Cross-domain Few-shot Learning | — | 0 |
| Automatic Tuberculosis and COVID-19 cough classification using deep learning | — | 0 |
| CoDo: Contrastive Learning with Downstream Background Invariance for Detection | — | 0 |
| Object Detection in Indian Food Platters using Transfer Learning with YOLOv4 | — | 0 |
| Long-term stability and generalization of observationally-constrained stochastic data-driven models for geophysical turbulence | Code | 0 |
| Sub-Word Alignment Is Still Useful: A Vest-Pocket Method for Enhancing Low-Resource Machine Translation | Code | 0 |
| An Effective Scheme for Maize Disease Recognition based on Deep Networks | — | 0 |
| ProQA: Structural Prompt-based Pre-training for Unified Question Answering | Code | 1 |
| Transfer Learning Based Efficient Traffic Prediction with Limited Training Data | — | 0 |
| On the Use of BERT for Automated Essay Scoring: Joint Learning of Multi-Scale Essay Representation | Code | 1 |
| Data-Free Adversarial Knowledge Distillation for Graph Neural Networks | — | 0 |
| Training from Zero: Radio Frequency Machine Learning Data Quantity Forecasting | — | 0 |
| Keratoconus Classifier for Smartphone-based Corneal Topographer | — | 0 |
| Empowering parameter-efficient transfer learning by recognizing the kernel structure in self-attention | Code | 1 |
| Label-aware Multi-level Contrastive Learning for Cross-lingual Spoken Language Understanding | — | 0 |
| Utility-Oriented Underwater Image Quality Assessment Based on Transfer Learning | — | 0 |
| Time-Series Domain Adaptation via Sparse Associative Structure Alignment: Learning Invariance and Variance | — | 0 |
| RCMNet: A deep learning model assists CAR-T therapy for leukemia | — | 0 |
| Dynamically writing coupled memories using a reinforcement learning agent, meeting physical bounds | — | 0 |
| Large Scale Transfer Learning for Differentially Private Image Classification | — | 0 |
| Transferring Chemical and Energetic Knowledge Between Molecular Systems with Machine Learning | — | 0 |
| Understanding Transfer Learning for Chest Radiograph Clinical Report Generation with Modified Transformer Architectures | — | 0 |
| ON-TRAC Consortium Systems for the IWSLT 2022 Dialect and Low-resource Speech Translation Tasks | — | 0 |
| Evaluating Transferability for Covid 3D Localization Using CT SARS-CoV-2 segmentation models | — | 0 |
| XLTime: A Cross-Lingual Knowledge Transfer Framework for Temporal Expression Extraction | Code | 0 |
| MTTrans: Cross-Domain Object Detection with Mean-Teacher Transformer | Code | 1 |
| Neural Language Taskonomy: Which NLP Tasks are the most Predictive of fMRI Brain Activity? | — | 0 |
| Kompetencer: Fine-grained Skill Classification in Danish Job Postings via Distant Supervision and Transfer Learning | Code | 0 |
| FINETUNA: Fine-tuning Accelerated Molecular Simulations | — | 0 |
| Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages | Code | 1 |
| Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning | Code | 1 |
| Evaluating zero-shot transfers and multilingual models for dependency parsing and POS tagging within the low-resource language family Tupían | — | 0 |
| S^4-Tuning: A Simple Cross-lingual Sub-network Tuning Method | — | 0 |
| A Checkpoint on Multilingual Misogyny Identification | — | 0 |
| Towards Detecting Political Bias in Hindi News Articles | — | 0 |
| Transfer Learning and Prediction Consistency for Detecting Offensive Spans of Text | — | 0 |
| Uncertainty Regularized Multi-Task Learning | — | 0 |
| What does it take to bake a cake? The RecipeRef corpus and anaphora resolution in procedural text | Code | 0 |
| When does CLIP generalize better than unimodal models? When judging human-centric concepts | — | 0 |
| Conversational Bots for Psychotherapy: A Study of Generative Transformer Models Using Domain-specific Dialogues | — | 0 |
| Leveraging Seq2seq Language Generation for Multi-level Product Issue Identification | — | 0 |
| Cross-lingual Semantic Role Labelling with the ValPaL Database Knowledge | — | 0 |
| Challenges in including extra-linguistic context in pre-trained language models | — | 0 |
| Shallow Parsing for Nepal Bhasa Complement Clauses | — | 0 |
| MUCS@Text-LT-EDI@ACL 2022: Detecting Sign of Depression from Social Media Text using Supervised Learning Approach | — | 0 |
| On Target Representation in Continuous-output Neural Machine Translation | — | 0 |
| An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition | — | 0 |
| How Can Cross-lingual Knowledge Contribute Better to Fine-Grained Entity Typing? | — | 0 |
Page 97 of 207

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | APCLIP | Accuracy | 84.2 | — | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | — | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | — | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | — | Unverified |
| 5 | MEDA | Accuracy | 60.3 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | CNN | 10-20% Mask PSNR | 3.23 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Co-Tuning | Accuracy | 85.65 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Physical Access | EER | 5.74 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | riadd.aucmedi | AUROC | 0.95 | — | Unverified |