SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge a pre-trained model has already acquired to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
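The common recipe described above, freezing a pre-trained feature extractor and training only a small new head on the target task, can be sketched in a toy form. This is a hypothetical illustration, not a real pre-trained network: the "pretrained" weights are random stand-ins, and all names (`W_pretrained`, `features`, the toy data) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for pretrained weights: in practice these would be loaded from a
# model trained on a large source task, not sampled at random.
W_pretrained = rng.normal(size=(16, 8))  # frozen during fine-tuning

def features(x):
    """Frozen feature extractor: ReLU(x @ W_pretrained)."""
    return np.maximum(0.0, x @ W_pretrained)

# Small target-task dataset (limited data is exactly where transfer helps).
X = rng.normal(size=(64, 16))
y = (X[:, 0] > 0).astype(float)  # toy binary labels

# Trainable head: logistic regression on top of the frozen features.
w = np.zeros(8)
b = 0.0
lr = 0.05

def loss(w, b):
    z = features(X) @ w + b
    p = 1.0 / (1.0 + np.exp(-z))
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

initial = loss(w, b)
for _ in range(300):  # fine-tune only the head; W_pretrained never changes
    z = features(X) @ w + b
    p = 1.0 / (1.0 + np.exp(-z))
    grad = p - y
    w -= lr * features(X).T @ grad / len(X)
    b -= lr * grad.mean()
final = loss(w, b)

print(f"head-only fine-tuning: loss {initial:.3f} -> {final:.3f}")
```

Because only the head's parameters are updated, far fewer examples are needed than for training the whole network; full fine-tuning would additionally update the extractor's weights with a small learning rate.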

Papers

Showing 8951-9000 of 10307 papers

Title | Status | Hype
GPT-3 Models are Poor Few-Shot Learners in the Biomedical Domain | Code | 0
Medical Crossing: a Cross-lingual Evaluation of Clinical Entity Linking | Code | 0
Exploring the Robustness of Task-oriented Dialogue Systems for Colloquial German Varieties | Code | 0
Grad2Task: Improved Few-shot Text Classification Using Gradients for Task Representation | Code | 0
Sub-Word Alignment Is Still Useful: A Vest-Pocket Method for Enhancing Low-Resource Machine Translation | Code | 0
Exploring the potential of transfer learning for metamodels of heterogeneous material deformation | Code | 0
Medical Image Segmentation Using Deep Learning: A Survey | Code | 0
Tensor Analysis with n-Mode Generalized Difference Subspace | Code | 0
Over-parameterised Shallow Neural Networks with Asymmetrical Node Scaling: Global Convergence Guarantees and Feature Learning | Code | 0
Medical supervised masked autoencoders: Crafting a better masking strategy and efficient fine-tuning schedule for medical image classification | Code | 0
Chair Segments: A Compact Benchmark for the Study of Object Segmentation | Code | 0
Cell reprogramming design by transfer learning of functional transcriptional networks | Code | 0
MedMerge: Merging Models for Effective Transfer Learning to Medical Imaging Tasks | Code | 0
A Neural Grammatical Error Correction System Built On Better Pre-training and Sequential Transfer Learning | Code | 0
Exploring the Limits of Weakly Supervised Pretraining | Code | 0
Self-training solutions for the ICCV 2023 GeoNet Challenge | Code | 0
RelO: An Overlapping Relation Extraction Dataset and Model | Code | 0
Graph-based Knowledge Distillation by Multi-head Attention Network | Code | 0
GraphBridge: Towards Arbitrary Transfer Learning in GNNs | Code | 0
Graph Constrained Data Representation Learning for Human Motion Segmentation | Code | 0
Exploring the Effectiveness and Consistency of Task Selection in Intermediate-Task Transfer Learning | Code | 0
Deep into The Domain Shift: Transfer Learning through Dependence Regularization | Code | 0
Graph Distillation for Action Detection with Privileged Modalities | Code | 0
Exploring the Benefits of Visual Prompting in Differential Privacy | Code | 0
Celebrity Profiling | Code | 0
MedViLaM: A multimodal large language model with advanced generalizability and explainability for medical data understanding and generation | Code | 0
Graph Few-shot Learning via Knowledge Transfer | Code | 0
Exploring the Benefits of Differentially Private Pre-training and Parameter-Efficient Fine-tuning for Table Transformers | Code | 0
Exploring Target Representations for Masked Autoencoders | Code | 0
Leveraging Cross-Lingual Transfer Learning in Spoken Named Entity Recognition Systems | Code | 0
AdaTriplet-RA: Domain Matching via Adaptive Triplet and Reinforced Attention for Unsupervised Domain Adaptation | Code | 0
Deep Image-to-Recipe Translation | Code | 0
Exploring Self-Supervised Representation Learning For Low-Resource Medical Image Analysis | Code | 0
Graph Neural Networks for Surfactant Multi-Property Prediction | Code | 0
Deep image representations using caption generators | Code | 0
Exploring Pre-Trained Transformers and Bilingual Transfer Learning for Arabic Coreference Resolution | Code | 0
Exploring Open-world Continual Learning with Knowns-Unknowns Knowledge Transfer | Code | 0
MELEP: A Novel Predictive Measure of Transferability in Multi-Label ECG Diagnosis | Code | 0
CEIMVEN: An Approach of Cutting Edge Implementation of Modified Versions of EfficientNet (V1-V2) Architecture for Breast Cancer Detection and Classification from Ultrasound Images | Code | 0
TAPAS: Weakly Supervised Table Parsing via Pre-training | Code | 0
Memebusters at SemEval-2020 Task 8: Feature Fusion Model for Sentiment Analysis on Memes Using Transfer Learning | Code | 0
Deep Image Compression via End-to-End Learning | Code | 0
Graph-Sequential Alignment and Uniformity: Toward Enhanced Recommendation Systems | Code | 0
An Embarrassingly Simple Approach for Transfer Learning from Pretrained Language Models | Code | 0
CBM: Curriculum by Masking | Code | 0
GreekBART: The First Pretrained Greek Sequence-to-Sequence Model | Code | 0
Deep Hybrid Architecture for Very Low-Resolution Image Classification Using Capsule Attention | Code | 0
SpaceQA: Answering Questions about the Design of Space Missions and Space Craft Concepts | Code | 0
SU-Net: Pose estimation network for non-cooperative spacecraft on-orbit | Code | 0
Removing Non-Stationary Knowledge From Pre-Trained Language Models for Entity-Level Sentiment Classification in Finance | Code | 0
Page 180 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified