SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
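The paragraph above can be sketched in code. The following is a minimal toy illustration (not any specific method from the papers listed below): a "pretrained" linear feature extractor is learned on a data-rich task A, then frozen, and only a small new classification head is trained on a data-poor but related task B. All names and the synthetic data setup here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- "Pretraining" on task A (plenty of data) ---
# Simulate pretraining by fitting a linear feature map W_feat on task A.
X_a = rng.normal(size=(1000, 20))
W_true = rng.normal(size=(20, 8))
H_a = np.tanh(X_a @ W_true)                          # toy task-A targets
W_feat, *_ = np.linalg.lstsq(X_a, H_a, rcond=None)   # "pretrained" weights

def features(X):
    """Frozen feature extractor reused from the pretrained model."""
    return np.tanh(X @ W_feat)

# --- Fine-tuning on task B (limited data) ---
X_b = rng.normal(size=(50, 20))
y_b = (X_b @ W_true[:, 0] > 0).astype(float)         # related binary task

w_head = np.zeros(8)                                 # new task-specific head
b_head = 0.0
lr = 0.5
F_b = features(X_b)                                  # backbone stays frozen
for _ in range(300):                                 # train only the head
    p = 1.0 / (1.0 + np.exp(-(F_b @ w_head + b_head)))  # sigmoid
    grad = p - y_b                                   # logistic-loss gradient
    w_head -= lr * F_b.T @ grad / len(y_b)
    b_head -= lr * grad.mean()

acc = ((F_b @ w_head + b_head > 0) == (y_b > 0.5)).mean()
```

Because the backbone (`W_feat`) is frozen, only 9 parameters are trained on the 50 task-B examples, which is the core trade-off transfer learning exploits: reuse expensive pretrained representations, fit only a small task-specific component.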

(Image credit: Subodh Malgonde)

Papers

Showing 476–500 of 10307 papers

Title | Status | Hype
Improving Zero-Shot Generalization for CLIP with Synthesized Prompts | Code | 1
AnyStar: Domain randomized universal star-convex 3D instance segmentation | Code | 1
MDViT: Multi-domain Vision Transformer for Small Medical Image Segmentation Datasets | Code | 1
SAM-DA: UAV Tracks Anything at Night with SAM-Powered Domain Adaptation | Code | 1
Audio Embeddings as Teachers for Music Classification | Code | 1
BuildingsBench: A Large-Scale Dataset of 900K Buildings and Benchmark for Short-Term Load Forecasting | Code | 1
DUET: 2D Structured and Approximately Equivariant Representations | Code | 1
Parameter-Level Soft-Masking for Continual Learning | Code | 1
A systematic approach to deep learning-based nodule detection in chest radiographs | Code | 1
Masking meets Supervision: A Strong Learning Alliance | Code | 1
Synthetic optical coherence tomography angiographs for detailed retinal vessel segmentation without human annotations | Code | 1
BioREx: Improving Biomedical Relation Extraction by Leveraging Heterogeneous Datasets | Code | 1
Neural Priming for Sample-Efficient Adaptation | Code | 1
LabelBench: A Comprehensive Framework for Benchmarking Adaptive Label-Efficient Learning | Code | 1
PoET: A generative model of protein families as sequences-of-sequences | Code | 1
Prompter: Zero-shot Adaptive Prefixes for Dialogue State Tracking Domain Adaptation | Code | 1
Zambezi Voice: A Multilingual Speech Corpus for Zambian Languages | Code | 1
Transferring Annotator- and Instance-dependent Transition Matrix for Learning from Crowds | Code | 1
Training Like a Medical Resident: Context-Prior Learning Toward Universal Medical Image Segmentation | Code | 1
Transfer learning for atomistic simulations using GNNs and kernel mean embeddings | Code | 1
A Survey of Label-Efficient Deep Learning for 3D Point Clouds | Code | 1
Point-GCC: Universal Self-supervised 3D Scene Pre-training via Geometry-Color Contrast | Code | 1
Deeply Coupled Cross-Modal Prompt Learning | Code | 1
Pre-training Contextualized World Models with In-the-wild Videos for Reinforcement Learning | Code | 1
One Network, Many Masks: Towards More Parameter-Efficient Transfer Learning | Code | 1
Page 20 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | — | Unverified
2 | DFA-ENT | Accuracy | 69.2 | — | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | — | Unverified
4 | EasyTL | Accuracy | 63.3 | — | Unverified
5 | MEDA | Accuracy | 60.3 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | — | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | — | Unverified