SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
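The core recipe described above — freeze the pre-trained parameters and train only a small new head on the target task — can be sketched in a few lines. This is a minimal pure-Python illustration, not any real pre-trained model: the frozen "backbone" weights and the toy regression task are made up for the example.

```python
# "Pretrained" feature extractor: these weights are frozen, standing in
# for parameters learned on a large source task (hypothetical values).
FROZEN_W = [0.9, -0.4, 0.3]

def features(x):
    # Frozen backbone: maps a scalar input to a 3-dim feature vector.
    return [w * x for w in FROZEN_W]

# New target task: learn y = 2*x from a handful of examples by training
# ONLY a small linear "head" on top of the frozen features.
data = [(x, 2.0 * x) for x in [-2.0, -1.0, 0.5, 1.0, 3.0]]

head = [0.0, 0.0, 0.0]            # trainable head weights
lr = 0.05
for _ in range(500):              # plain gradient descent on squared error
    for x, y in data:
        f = features(x)
        pred = sum(h * fi for h, fi in zip(head, f))
        err = pred - y
        # Only the head is updated; FROZEN_W never changes.
        head = [h - lr * err * fi for h, fi in zip(head, f)]

# Prediction on an unseen input; should be close to 2*2.0 = 4.0.
print(round(sum(h * fi for h, fi in zip(head, features(2.0))), 2))
```

Because the backbone stays fixed, only three parameters are fit, which is why transfer learning works with far less target-task data than training end to end.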

Papers

Showing 826–850 of 10307 papers

Title | Status | Hype
Few-Shot Transfer Learning for Device-Free Fingerprinting Indoor Localization | Code | 1
IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages | Code | 1
ViT-HGR: Vision Transformer-based Hand Gesture Recognition from High Density Surface EMG Signals | Code | 1
AttentionHTR: Handwritten Text Recognition Based on Attention Encoder-Decoder Networks | Code | 1
On the adaptation of recurrent neural networks for system identification | Code | 1
Revisiting Weakly Supervised Pre-Training of Visual Perception Models | Code | 1
An Empirical Investigation of Model-to-Model Distribution Shifts in Trained Convolutional Filters | Code | 1
Zero-Shot Machine Unlearning | Code | 1
Assemble Foundation Models for Automatic Code Summarization | Code | 1
Head2Toe: Utilizing Intermediate Representations for Better Transfer Learning | Code | 1
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Code | 1
A 1D CNN for high accuracy classification and transfer learning in motor imagery EEG-based brain-computer interface | Code | 1
Weakly-supervised continual learning for class-incremental segmentation | Code | 1
Learning Multiple Adverse Weather Removal via Two-Stage Knowledge Learning and Multi-Contrastive Regularization: Toward a Unified Model | Code | 1
Representation Topology Divergence: A Method for Comparing Neural Network Representations | Code | 1
A proposal for Multimodal Emotion Recognition using aural transformers and Action Units on RAVDESS dataset | Code | 1
Confidence-Aware Multi-Teacher Knowledge Distillation | Code | 1
Fine-Tuning Transformers: Vocabulary Transfer | Code | 1
Parameter Differentiation based Multilingual Neural Machine Translation | Code | 1
Continual Learning with Knowledge Transfer for Sentiment Classification | Code | 1
Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition | Code | 1
PeopleSansPeople: A Synthetic Data Generator for Human-Centric Computer Vision | Code | 1
Learning from Guided Play: A Scheduled Hierarchical Approach for Improving Exploration in Adversarial Imitation Learning | Code | 1
RegionCLIP: Region-based Language-Image Pretraining | Code | 1
Connecting the Dots between Audio and Text without Parallel Data through Visual Knowledge Transfer | Code | 1
Page 34 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | – | Unverified
2 | DFA-ENT | Accuracy | 69.2 | – | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | – | Unverified
4 | EasyTL | Accuracy | 63.3 | – | Unverified
5 | MEDA | Accuracy | 60.3 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | – | Unverified