SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge encoded in a pre-trained model to solve a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
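The core idea can be illustrated with a minimal sketch: a frozen "pretrained" feature extractor paired with a small trainable head fit on limited target-task data. Here the backbone is just a fixed random projection standing in for a real pre-trained network, and all names, shapes, and hyperparameters are illustrative assumptions, not any specific library's API.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained feature extractor: in practice this would be
# e.g. a CNN backbone trained on a large source dataset. Its weights are
# frozen -- W_pre is never updated during target-task training.
W_pre = rng.normal(size=(64, 16)) / np.sqrt(64)

def extract_features(x):
    """Frozen 'pretrained' layer: project raw inputs to feature space."""
    return np.maximum(x @ W_pre, 0.0)  # ReLU

# Small target-task dataset (limited data is the typical transfer setting).
X = rng.normal(size=(200, 64))
true_w = rng.normal(size=16)
y = (extract_features(X) @ true_w > 0).astype(float)  # synthetic binary labels

# New task head: the only trainable parameters.
w = np.zeros(16)
lr = 0.1
for _ in range(500):
    feats = extract_features(X)              # backbone output stays fixed
    p = 1.0 / (1.0 + np.exp(-(feats @ w)))   # sigmoid
    grad = feats.T @ (p - y) / len(y)        # logistic-loss gradient
    w -= lr * grad                           # update the head only

preds = extract_features(X) @ w > 0
accuracy = (preds == y.astype(bool)).mean()
```

Because only the 16-parameter head is trained, a few hundred labeled examples suffice; with a real backbone one would typically also unfreeze and fine-tune the top layers at a lower learning rate once the head has converged.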

Papers

Showing 9601–9625 of 10307 papers

Title | Status | Hype
Input-gradient space particle inference for neural network ensembles | Code | 0
Instruction Tuned Models are Quick Learners | Code | 0
Integrated Parameter-Efficient Tuning for General-Purpose Audio Models | Code | 0
Integrating Curricula with Replays: Its Effects on Continual Learning | Code | 0
Integrating Transformer and Autoencoder Techniques with Spectral Graph Algorithms for the Prediction of Scarcely Labeled Molecular Data | Code | 0
Intensive Care as One Big Sequence Modeling Problem | Code | 0
Interpretable Acoustic Representation Learning on Breathing and Speech Signals for COVID-19 Detection | Code | 0
Interpretable and Transferable Models to Understand the Impact of Lockdown Measures on Local Air Quality | Code | 0
Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network | Code | 0
Interpretable neural architecture search and transfer learning for understanding CRISPR/Cas9 off-target enzymatic reactions | Code | 0
Interpretation of Swedish Sign Language using Convolutional Neural Networks and Transfer Learning | Code | 0
Interpretations of Domain Adaptations via Layer Variational Analysis | Code | 0
Interpreting and Exploiting Functional Specialization in Multi-Head Attention under Multi-task Learning | Code | 0
Interspecies Knowledge Transfer for Facial Keypoint Detection | Code | 0
Intracerebral EEG Artifact Identification Using Convolutional Neural Networks | Code | 0
Intrinsically Motivated Open-Ended Multi-Task Learning Using Transfer Learning to Discover Task Hierarchy | Code | 0
Invariant Models for Causal Transfer Learning | Code | 0
Inverse Design of Potential Singlet Fission Molecules using a Transfer Learning Based Approach | Code | 0
Bayesian Inverse Transfer in Evolutionary Multiobjective Optimization | Code | 0
Investigating label suggestions for opinion mining in German Covid-19 social media | Code | 0
Investigating Neural Machine Translation for Low-Resource Languages: Using Bavarian as a Case Study | Code | 0
Investigating Numeracy Learning Ability of a Text-to-Text Transfer Model | Code | 0
Investigating Shallow and Deep Learning Techniques for Emotion Classification in Short Persian Texts | Code | 0
Investigating Transferability in Pretrained Language Models | Code | 0
Investigating Transfer Learning Capabilities of Vision Transformers and CNNs by Fine-Tuning a Single Trainable Block | Code | 0
Page 385 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | – | Unverified
2 | DFA-ENT | Accuracy | 69.2 | – | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | – | Unverified
4 | EasyTL | Accuracy | 63.3 | – | Unverified
5 | MEDA | Accuracy | 60.3 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | – | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | – | Unverified