SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is re-purposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications — typically by freezing the learned feature layers and retraining only a small task-specific head.
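The freeze-and-retrain recipe above can be sketched end to end in plain NumPy, with no specific deep-learning framework assumed. All names and tasks below are illustrative: a small tanh feature layer is "pre-trained" on a data-rich source task, then frozen, and only a fresh linear head is fit on a 30-example target task.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Source task: plenty of labelled data ---------------------------------
# The label depends on x0 + x1, so the feature layer can learn that direction.
X_src = rng.normal(size=(500, 10))
y_src = (X_src[:, 0] + X_src[:, 1] > 0.0).astype(float)

# Jointly train a tanh feature layer W and a linear head on the source task.
W = rng.normal(scale=0.1, size=(10, 8))   # feature extractor (to be transferred)
w_head = np.zeros(8)
lr = 0.5
for _ in range(300):
    H = np.tanh(X_src @ W)
    g = (sigmoid(H @ w_head) - y_src) / len(y_src)   # logistic-loss gradient
    W -= lr * X_src.T @ (np.outer(g, w_head) * (1.0 - H ** 2))
    w_head -= lr * H.T @ g

# --- Target task: only 30 labelled examples, related decision rule --------
X_tgt = rng.normal(size=(30, 10))
y_tgt = (X_tgt[:, 0] + X_tgt[:, 1] > 0.5).astype(float)  # shifted threshold

def train_head(X, y, W_frozen, lr=0.5, epochs=400):
    """Transfer step: W_frozen is never updated, only a new head is trained."""
    H = np.tanh(X @ W_frozen)
    w, b = np.zeros(H.shape[1]), 0.0
    for _ in range(epochs):
        g = (sigmoid(H @ w + b) - y) / len(y)
        w -= lr * H.T @ g
        b -= lr * g.sum()
    return w, b

w_new, b_new = train_head(X_tgt, y_tgt, W)

# Evaluate the transferred model on fresh target-task data.
X_test = rng.normal(size=(200, 10))
y_test = (X_test[:, 0] + X_test[:, 1] > 0.5).astype(float)
pred = sigmoid(np.tanh(X_test @ W) @ w_new + b_new) > 0.5
acc = (pred == y_test).mean()
print(f"target-task accuracy with a frozen pre-trained feature layer: {acc:.2f}")
```

Because the bias term of the new head absorbs the shifted threshold, the frozen source features are enough to fit the target task from very few examples; a model trained from scratch on 30 points would have to relearn the feature direction as well.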

(Image credit: Subodh Malgonde)

Papers

Showing 6701–6725 of 10307 papers

| Title | Status | Hype |
| --- | --- | --- |
| kk2018 at SemEval-2020 Task 9: Adversarial Training for Code-Mixing Sentiment Classification | | 0 |
| A Novel Transformer Network with Shifted Window Cross-Attention for Spatiotemporal Weather Forecasting | | 0 |
| KMF: Knowledge-Aware Multi-Faceted Representation Learning for Zero-Shot Node Classification | | 0 |
| k-Nearest Neighbor Augmented Neural Networks for Text Classification | | 0 |
| A Novel Transfer Learning Method Utilizing Acoustic and Vibration Signals for Rotating Machinery Fault Diagnosis | | 0 |
| A novel transfer learning method based on common space mapping and weighted domain matching | | 0 |
| Knee menisci segmentation and relaxometry of 3D ultrashort echo time (UTE) cones MR imaging using attention U-Net with transfer learning | | 0 |
| k-NN as a Simple and Effective Estimator of Transferability | | 0 |
| Knowledge as A Bridge: Improving Cross-domain Answer Selection with External Knowledge | | 0 |
| Knowledge-Aware Prompt Tuning for Generalizable Vision-Language Models | | 0 |
| A Novel Transfer Learning-Based Approach for Screening Pre-existing Heart Diseases Using Synchronized ECG Signals and Heart Sounds | | 0 |
| Knowledge-Based Learning through Feature Generation | | 0 |
| Sequential PatchCore: Anomaly Detection for Surface Inspection using Synthetic Impurities | | 0 |
| Knowledge capture, adaptation and composition (KCAC): A framework for cross-task curriculum learning in robotic manipulation | | 0 |
| A Novel Transfer Learning Approach upon Hindi, Arabic, and Bangla Numerals using Convolutional Neural Networks | | 0 |
| A Novel Spike Transformer Network for Depth Estimation from Event Cameras via Cross-modality Knowledge Distillation | | 0 |
| Knowledge Distillation Label Smoothing: Fact or Fallacy? | | 0 |
| Knowledge Distillation Based Semantic Communications For Multiple Users | | 0 |
| Knowledge Distillation of Black-Box Large Language Models | | 0 |
| Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions | | 0 |
| Sequential Reptile: Inter-Task Gradient Alignment for Multilingual Learning | | 0 |
| Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher | | 0 |
| A Novel Semi-supervised Meta Learning Method for Subject-transfer Brain-computer Interface | | 0 |
| Knowledge Distillation Under Ideal Joint Classifier Assumption | | 0 |
| Knowledge Distillation via Token-level Relationship Graph | | 0 |
Page 269 of 413

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | APCLIP | Accuracy | 84.2 | | Unverified |
| 2 | DFA-ENT | Accuracy | 69.2 | | Unverified |
| 3 | DFA-SAFN | Accuracy | 69.1 | | Unverified |
| 4 | EasyTL | Accuracy | 63.3 | | Unverified |
| 5 | MEDA | Accuracy | 60.3 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | CNN | 10-20% Mask PSNR | 3.23 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Co-Tuning | Accuracy | 85.65 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Physical Access | EER | 5.74 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | riadd.aucmedi | AUROC | 0.95 | | Unverified |