SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
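The workflow described above, freezing a pre-trained backbone and training only a small task-specific head on limited target data, can be sketched in a few lines. The snippet below is a minimal illustration, not any particular paper's method: the "pretrained" feature extractor is a fixed random projection standing in for real pre-trained weights, and the dataset is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor. In practice these weights
# would come from a model trained on a large source task; here they are
# a fixed random projection purely for illustration.
W_pretrained = rng.normal(size=(2, 8))

def extract_features(x):
    # Frozen "backbone": these weights are never updated during fine-tuning.
    return np.tanh(x @ W_pretrained)

# Small synthetic target-task dataset: two Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),
               rng.normal(+1.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# New task-specific head (logistic regression), trained from scratch
# on the small target dataset while the backbone stays frozen.
w = np.zeros(8)
b = 0.0
lr = 0.5
feats = extract_features(X)
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))   # sigmoid
    grad = p - y                                 # logistic-loss gradient
    w -= lr * feats.T @ grad / len(y)
    b -= lr * grad.mean()

preds = (1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5).astype(int)
print("train accuracy:", (preds == y).mean())
```

Only the 9 head parameters are learned here; with a real backbone the same pattern applies, and optionally the backbone itself is unfrozen later for full fine-tuning at a lower learning rate.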

Papers

Showing 1501–1550 of 10,307 papers

Title | Status | Hype
Enhancing Speech Intelligibility in Text-To-Speech Synthesis using Speaking Style Conversion | Code | 1
Semi-Supervised Learning with Taxonomic Labels | Code | 1
Estimating Q(s,s') with Deep Deterministic Dynamics Gradients | Code | 1
SFace: Privacy-friendly and Accurate Face Recognition using Synthetic Data | Code | 1
An Evaluation of Self-Supervised Pre-Training for Skin-Lesion Analysis | Code | 1
ShapeGlot: Learning Language for Shape Differentiation | Code | 1
I2I: Initializing Adapters with Improvised Knowledge | Code | 1
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching | Code | 1
An Evolutionary Multitasking Algorithm with Multiple Filtering for High-Dimensional Feature Selection | Code | 1
Improved Regularization and Robustness for Fine-tuning in Neural Networks | Code | 1
Improving Transferability of Representations via Augmentation-Aware Self-Supervision | Code | 1
Equivariant Graph Neural Networks for 3D Macromolecular Structure | Code | 1
BioREx: Improving Biomedical Relation Extraction by Leveraging Heterogeneous Datasets | Code | 1
BIOSCAN-5M: A Multimodal Dataset for Insect Biodiversity | Code | 1
Knowledge Inheritance for Pre-trained Language Models | Code | 1
ERM-KTP: Knowledge-Level Machine Unlearning via Knowledge Transfer | Code | 1
BirdSAT: Cross-View Contrastive Masked Autoencoders for Bird Species Classification and Mapping | Code | 1
S-JEPA: towards seamless cross-dataset transfer through dynamic spatial attention | Code | 1
Evaluating Protein Transfer Learning with TAPE | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
A New Knowledge Distillation Network for Incremental Few-Shot Surface Defect Detection | Code | 1
BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning | Code | 1
Evaluating Parameter-Efficient Transfer Learning Approaches on SURE Benchmark for Speech Understanding | Code | 1
Soft Contrastive Learning for Time Series | Code | 1
Blindly Assess Quality of In-the-Wild Videos via Quality-aware Pre-training and Motion Perception | Code | 1
Evaluating histopathology transfer learning with ChampKit | Code | 1
EViT: An Eagle Vision Transformer with Bi-Fovea Self-Attention | Code | 1
PointMCD: Boosting Deep Point Cloud Encoders via Multi-view Cross-modal Distillation for 3D Shape Recognition | Code | 1
AmbiFC: Fact-Checking Ambiguous Claims with Evidence | Code | 1
Meta-Knowledge Transfer for Inductive Knowledge Graph Embedding | Code | 1
EXAMS: A Multi-Subject High School Examinations Dataset for Cross-Lingual and Multilingual Question Answering | Code | 1
hULMonA: The Universal Language Model in Arabic | Code | 0
Adaptation of Tacotron2-based Text-To-Speech for Articulatory-to-Acoustic Mapping using Ultrasound Tongue Imaging | Code | 0
Assaying Out-Of-Distribution Generalization in Transfer Learning | Code | 0
A Generative Adversarial Approach To ECG Synthesis And Denoising | Code | 0
Human Genome Book: Words, Sentences and Paragraphs | Code | 0
A Split-then-Join Approach to Abstractive Summarization for Very Long Documents in a Low Resource Setting | Code | 0
Aspect-augmented Adversarial Networks for Domain Adaptation | Code | 0
HR-VILAGE-3K3M: A Human Respiratory Viral Immunization Longitudinal Gene Expression Dataset for Systems Immunity | Code | 0
HTR-JAND: Handwritten Text Recognition with Joint Attention Network and Knowledge Distillation | Code | 0
Human-Inspired Framework to Accelerate Reinforcement Learning | Code | 0
Asking Crowdworkers to Write Entailment Examples: The Best of Bad Options | Code | 0
Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language | Code | 0
How Well Do Vision Transformers (VTs) Transfer To The Non-Natural Image Domain? An Empirical Study Involving Art Classification | Code | 0
Asking and Answering Questions to Extract Event-Argument Structures | Code | 0
How to tackle an emerging topic? Combining strong and weak labels for Covid news NER | Code | 0
How to Train a CAT: Learning Canonical Appearance Transformations for Direct Visual Localization Under Illumination Change | Code | 0
ACE: Zero-Shot Image to Image Translation via Pretrained Auto-Contrastive-Encoder | Code | 0
How to evaluate word embeddings? On importance of data efficiency and simple supervised tasks | Code | 0
How transfer learning is used in generative models for image classification: improved accuracy | Code | 0
Page 31 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | | Unverified
2 | DFA-ENT | Accuracy | 69.2 | | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | | Unverified
4 | EasyTL | Accuracy | 63.3 | | Unverified
5 | MEDA | Accuracy | 60.3 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | | Unverified