SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original one that the pre-trained model can be adapted with only minor modifications.
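The core recipe described above — keep a pre-trained feature extractor frozen and train only a small new head on the target task — can be sketched in a few lines. This is a hypothetical, self-contained illustration (a fixed random projection stands in for the pre-trained backbone; the dataset is synthetic), not any particular paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: a FROZEN projection whose
# weights are never updated during adaptation.
W_backbone = rng.normal(size=(20, 8))

def extract_features(x):
    """Frozen feature extractor (the 'transferred' knowledge)."""
    return np.tanh(x @ W_backbone)

# Small labeled dataset for the *new* task (limited data).
X = rng.normal(size=(100, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only this tiny logistic-regression head is trained.
w = np.zeros(8)
b = 0.0
lr = 0.5
feats = extract_features(X)          # computed once; backbone stays fixed
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid predictions
    grad = p - y                                # log-loss gradient w.r.t. logits
    w -= lr * (feats.T @ grad) / len(y)         # update head weights only
    b -= lr * grad.mean()

acc = ((1.0 / (1.0 + np.exp(-(feats @ w + b))) > 0.5) == y).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

In practice the backbone would be a network such as a ResNet or a transformer loaded with pre-trained weights, and "freezing" would mean disabling gradient updates for its parameters while training the new head (optionally unfreezing some layers later for full fine-tuning).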

(Image credit: Subodh Malgonde)

Papers

Showing 1476–1500 of 10307 papers

Title | Status | Hype
EfficientFER: EfficientNetv2 Based Deep Learning Approach for Facial Expression Recognition | Code | 1
Efficient Fine-tuning of Audio Spectrogram Transformers via Soft Mixture of Adapters | Code | 1
Salient Object Detection in Optical Remote Sensing Images Driven by Transformer | Code | 1
An Ensemble Approach for Automated Theorem Proving Based on Efficient Name Invariant Graph Neural Representations | Code | 1
Efficient parametrization of multi-domain deep neural networks | Code | 1
IEPT: Instance-Level and Episode-Level Pretext Tasks for Few-Shot Learning | Code | 1
Efficient Training of Large Vision Models via Advanced Automated Progressive Learning | Code | 1
Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation | Code | 1
Efficient Visual Pretraining with Contrastive Detection | Code | 1
ScaleFL: Resource-Adaptive Federated Learning With Heterogeneous Clients | Code | 1
Investigating Transfer Learning in Multilingual Pre-trained Language Models through Chinese Natural Language Inference | Code | 1
A deep learning framework for solution and discovery in solid mechanics | Code | 1
Learning with Alignments: Tackling the Inter- and Intra-domain Shifts for Cross-multidomain Facial Expression Recognition | Code | 1
Zero-Shot Self-Supervised Learning for MRI Reconstruction | Code | 1
Schema2QA: High-Quality and Low-Cost Q&A Agents for the Structured Web | Code | 1
Unlocking Emergent Modularity in Large Language Models | Code | 1
SCT: A Simple Baseline for Parameter-Efficient Fine-Tuning via Salient Channels | Code | 1
Emergent Communication Pretraining for Few-Shot Machine Translation | Code | 1
Multilingual Knowledge Graph Completion via Ensemble Knowledge Transfer | Code | 1
EmoNet: A Transfer Learning Framework for Multi-Corpus Speech Emotion Recognition | Code | 1
Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction | Code | 1
Emotion Recognition from Speech Using Wav2vec 2.0 Embeddings | Code | 1
Empowering parameter-efficient transfer learning by recognizing the kernel structure in self-attention | Code | 1
WARP: Word-level Adversarial ReProgramming | Code | 1

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified