SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
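A common recipe for the approach described above is to freeze a pre-trained backbone and train only a new task-specific head. The sketch below is illustrative, not from any paper on this page: the "backbone" is a toy stand-in (in practice you would load real pre-trained weights, e.g. a torchvision ResNet), and all layer sizes are arbitrary.

```python
import torch
import torch.nn as nn

# Toy stand-in for a pre-trained backbone; in real use you would load
# actual pre-trained weights (e.g. torchvision.models.resnet18).
backbone = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
)

# Freeze the backbone so its "transferred" knowledge is preserved.
for p in backbone.parameters():
    p.requires_grad = False

# New head for a hypothetical 3-class target task.
head = nn.Linear(64, 3)
model = nn.Sequential(backbone, head)

# Only the head's parameters are optimized.
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a random toy batch.
x = torch.randn(8, 32)
y = torch.randint(0, 3, (8,))
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Frozen parameters receive no gradients; only the head is updated.
frozen = all(p.grad is None for p in backbone.parameters())
print(frozen)  # True
```

A common variant is to unfreeze the backbone after the head converges and continue training end-to-end at a much lower learning rate, trading some of the frozen knowledge for better task fit.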

Papers

Showing 1501–1525 of 10307 papers

Title | Status | Hype
Self-Supervised Dataset Distillation for Transfer Learning | Code | 1
MOTOR: A Time-To-Event Foundation Model For Structured Medical Records | Code | 1
Hyperspectral Classification Based on Lightweight 3-D-CNN With Transfer Learning | Code | 1
Evaluating Protein Transfer Learning with TAPE | Code | 1
An Evaluation of Self-Supervised Pre-Training for Skin-Lesion Analysis | Code | 1
Enhancing Speech Intelligibility in Text-To-Speech Synthesis using Speaking Style Conversion | Code | 1
Enhancing Traffic Safety with Parallel Dense Video Captioning for End-to-End Event Analysis | Code | 1
SemiReward: A General Reward Model for Semi-supervised Learning | Code | 1
An Evolutionary Multitasking Algorithm with Multiple Filtering for High-Dimensional Feature Selection | Code | 1
Improving Candidate Generation for Low-resource Cross-lingual Entity Linking | Code | 1
Matrix Information Theory for Self-Supervised Learning | Code | 1
Separate but Together: Unsupervised Federated Learning for Speech Enhancement from Non-IID Data | Code | 1
BioREx: Improving Biomedical Relation Extraction by Leveraging Heterogeneous Datasets | Code | 1
BIOSCAN-5M: A Multimodal Dataset for Insect Biodiversity | Code | 1
SFace: Privacy-friendly and Accurate Face Recognition using Synthetic Data | Code | 1
Shape Adaptor: A Learnable Resizing Module | Code | 1
BirdSAT: Cross-View Contrastive Masked Autoencoders for Bird Species Classification and Mapping | Code | 1
Entangled Watermarks as a Defense against Model Extraction | Code | 1
Soft Prompt Tuning for Augmenting Dense Retrieval with Large Language Models | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
A New Knowledge Distillation Network for Incremental Few-Shot Surface Defect Detection | Code | 1
BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning | Code | 1
SimLTD: Simple Supervised and Semi-Supervised Long-Tailed Object Detection | Code | 1
ERM-KTP: Knowledge-Level Machine Unlearning via Knowledge Transfer | Code | 1
Zero-1-to-3: Domain-level Zero-shot Cognitive Diagnosis via One Batch of Early-bird Students towards Three Diagnostic Objectives | Code | 1
Page 61 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | — | Unverified
2 | DFA-ENT | Accuracy | 69.2 | — | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | — | Unverified
4 | EasyTL | Accuracy | 63.3 | — | Unverified
5 | MEDA | Accuracy | 60.3 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | — | Unverified