SOTAVerified

Transfer Learning

Transfer Learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
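The freeze-and-retrain idea described above can be sketched in a few lines. This is an illustrative toy only: a fixed random projection stands in for a real pre-trained backbone, and only a new logistic-regression "head" is trained on the small target-task dataset (all names and data here are made up for the example).

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pre-trained" stage (stand-in): a frozen feature extractor.
# In practice this would be a network trained on a large source task;
# here a fixed linear map plays that role for illustration.
W_pretrained = rng.normal(size=(2, 8))  # maps 2-D inputs to 8-D features

def extract_features(x):
    """Frozen backbone: these weights are NOT updated during fine-tuning."""
    return np.tanh(x @ W_pretrained)

# Small target-task dataset: classify points by whether x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Fine-tune only a new linear "head" on top of the frozen features.
w, b = np.zeros(8), 0.0
lr = 0.5
for _ in range(300):
    feats = extract_features(X)
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid probabilities
    grad = p - y                                # logistic-loss gradient
    w -= lr * feats.T @ grad / len(X)
    b -= lr * grad.mean()

preds = 1.0 / (1.0 + np.exp(-(extract_features(X) @ w + b))) > 0.5
accuracy = (preds == y.astype(bool)).mean()
print(f"head-only fine-tuning accuracy: {accuracy:.2f}")
```

Because the backbone stays frozen, only 9 parameters are trained, which is why this pattern works even with very little target-task data; in a real setting one would load pretrained weights (e.g. an ImageNet backbone) and optionally unfreeze some layers for full fine-tuning.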

Papers

Showing 151–200 of 10,307 papers

Title | Status | Hype
CLIP-Driven Universal Model for Organ Segmentation and Tumor Detection | Code | 2
CascadeTabNet: An approach for end to end table detection and structure recognition from image-based documents | Code | 2
Finetuning Large Language Models for Vulnerability Detection | Code | 2
FlashST: A Simple and Universal Prompt-Tuning Framework for Traffic Prediction | Code | 2
Graph Domain Adaptation: Challenges, Progress and Prospects | Code | 2
GroupViT: Semantic Segmentation Emerges from Text Supervision | Code | 2
3D UX-Net: A Large Kernel Volumetric ConvNet Modernizing Hierarchical Transformer for Medical Image Segmentation | Code | 2
CARTE: Pretraining and Transfer for Tabular Learning | Code | 2
jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models | Code | 2
K-LITE: Learning Transferable Visual Models with External Knowledge | Code | 2
Large Scale Transfer Learning for Tabular Data via Language Modeling | Code | 2
AddressCLIP: Empowering Vision-Language Models for City-wide Image Address Localization | Code | 2
CLAP: Learning Transferable Binary Code Representations with Natural Language Supervision | Code | 2
CommonCanvas: An Open Diffusion Model Trained with Creative-Commons Images | Code | 2
Deep Learning-Enabled Semantic Communication Systems with Task-Unaware Transmitter and Dynamic Data | Code | 2
Lion: Adversarial Distillation of Proprietary Large Language Models | Code | 2
LST: Ladder Side-Tuning for Parameter and Memory Efficient Transfer Learning | Code | 2
InPars: Data Augmentation for Information Retrieval using Large Language Models | Code | 2
SF2Former: Amyotrophic Lateral Sclerosis Identification From Multi-center MRI Data Using Spatial and Frequency Fusion Transformer | Code | 2
MIGE: A Unified Framework for Multimodal Instruction-Based Image Generation and Editing | Code | 2
BioREx: Improving Biomedical Relation Extraction by Leveraging Heterogeneous Datasets | Code | 1
BIOSCAN-5M: A Multimodal Dataset for Insect Biodiversity | Code | 1
Beyond Semantic to Instance Segmentation: Weakly-Supervised Instance Segmentation via Semantic Knowledge Transfer and Self-Refinement | Code | 1
Analysis of skin lesion images with deep learning | Code | 1
Bilevel Continual Learning | Code | 1
BirdSAT: Cross-View Contrastive Masked Autoencoders for Bird Species Classification and Mapping | Code | 1
Benchmarking Detection Transfer Learning with Vision Transformers | Code | 1
Bert4XMR: Cross-Market Recommendation with Bidirectional Encoder Representations from Transformer | Code | 1
Accurate Clinical Toxicity Prediction using Multi-task Deep Neural Nets and Contrastive Molecular Explanations | Code | 1
Accuracy enhancement method for speech emotion recognition from spectrogram using temporal frequency correlation and positional information learning through knowledge transfer | Code | 1
Bayesian Optimization with Automatic Prior Selection for Data-Efficient Direct Policy Search | Code | 1
Beyond Self-Supervision: A Simple Yet Effective Network Distillation Alternative to Improve Backbones | Code | 1
BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling | Code | 1
BadMerging: Backdoor Attacks Against Model Merging | Code | 1
Bag of Tricks for Image Classification with Convolutional Neural Networks | Code | 1
Domain Prompt Learning for Efficiently Adapting CLIP to Unseen Domains | Code | 1
Amplifying Membership Exposure via Data Poisoning | Code | 1
BARThez: a Skilled Pretrained French Sequence-to-Sequence Model | Code | 1
AMMUS : A Survey of Transformer-based Pretrained Models in Natural Language Processing | Code | 1
2021 BEETL Competition: Advancing Transfer Learning for Subject Independence & Heterogenous EEG Data Sets | Code | 1
AVocaDo: Strategy for Adapting Vocabulary to Downstream Domain | Code | 1
Neuro2Semantic: A Transfer Learning Framework for Semantic Reconstruction of Continuous Language from Human Intracranial EEG | Code | 1
Anatomical Foundation Models for Brain MRIs | Code | 1
A Whisper transformer for audio captioning trained with synthetic captions and transfer learning | Code | 1
Enhanced Gaussian Process Dynamical Models with Knowledge Transfer for Long-term Battery Degradation Forecasting | Code | 1
BlackVIP: Black-Box Visual Prompting for Robust Transfer Learning | Code | 1
Automatic Noise Filtering with Dynamic Sparse Training in Deep Reinforcement Learning | Code | 1
AutoTune: Automatically Tuning Convolutional Neural Networks for Improved Transfer Learning | Code | 1
Aligning Pretraining for Detection via Object-Level Contrastive Learning | Code | 1
SentenceMIM: A Latent Variable Language Model | Code | 1
Page 4 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | n/a | Unverified
2 | DFA-ENT | Accuracy | 69.2 | n/a | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | n/a | Unverified
4 | EasyTL | Accuracy | 63.3 | n/a | Unverified
5 | MEDA | Accuracy | 60.3 | n/a | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | n/a | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | n/a | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | n/a | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | n/a | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | n/a | Unverified