SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task, so that the knowledge captured by the pre-trained model carries over to the new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original one that the pre-trained model can be adapted with only minor modifications.
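A minimal sketch of the idea, using only numpy: a "pretrained" feature extractor is kept frozen (here its weights are random stand-ins, purely illustrative), and only a small new head is trained on the target task. In practice the frozen part would be a real pretrained backbone (e.g. a CNN or transformer), but the training loop has the same shape.

```python
import numpy as np

# Transfer-learning sketch: freeze a "pretrained" feature extractor,
# train only a new linear head on the target task.
# NOTE: W_pre is a random stand-in for real pretrained weights.

rng = np.random.default_rng(0)

W_pre = rng.standard_normal((8, 16))          # frozen backbone weights

def extract_features(x):
    # Frozen ReLU projection; never updated during fine-tuning.
    return np.maximum(x @ W_pre, 0.0)

# Toy target-task data: 64 samples, 8 input dims, binary labels.
X = rng.standard_normal((64, 8))
y = (X[:, 0] > 0).astype(float)

# The new task head is the only trainable part.
w = np.zeros(16)
b = 0.0

def loss_and_grads(w, b):
    f = extract_features(X)
    p = 1.0 / (1.0 + np.exp(-(f @ w + b)))    # sigmoid output
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    err = p - y
    return loss, f.T @ err / len(y), err.mean()

initial_loss, _, _ = loss_and_grads(w, b)
for _ in range(200):                          # fine-tune the head only
    _, gw, gb = loss_and_grads(w, b)
    w -= 0.1 * gw
    b -= 0.1 * gb
final_loss, _, _ = loss_and_grads(w, b)
```

Because only the 17 head parameters are updated, the loop needs far less data and compute than training the whole network, which is the practical appeal of the technique.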

(Image credit: Subodh Malgonde)

Papers

Showing 1001-1025 of 10307 papers

Title | Status | Hype
One-stage Low-resolution Text Recognition with High-resolution Knowledge Transfer | Code | 1
A Simple Baseline for Bayesian Uncertainty in Deep Learning | Code | 1
A simple, efficient and scalable contrastive masked autoencoder for learning visual representations | Code | 1
A Simple Language Model for Task-Oriented Dialogue | Code | 1
A Simple Multi-Modality Transfer Learning Baseline for Sign Language Translation | Code | 1
A Simple yet Effective Framework for Few-Shot Aspect-Based Sentiment Analysis | Code | 1
Bridging the User-side Knowledge Gap in Knowledge-aware Recommendations with Large Language Models | Code | 1
A single-cell gene expression language model | Code | 1
On the effectiveness of task granularity for transfer learning | Code | 1
On the Opportunities and Risks of Foundation Models | Code | 1
BrainWave: A Brain Signal Foundation Model for Clinical Applications | Code | 1
On the Use of BERT for Automated Essay Scoring: Joint Learning of Multi-Scale Essay Representation | Code | 1
A General-Purpose Self-Supervised Model for Computational Pathology | Code | 1
OpenBox: A Generalized Black-box Optimization Service | Code | 1
Open-Vocabulary Multi-Label Classification via Multi-Modal Knowledge Transfer | Code | 1
Boosting Weakly Supervised Object Detection via Learning Bounding Box Adjusters | Code | 1
Breaking the Data Barrier -- Building GUI Agents Through Task Generalization | Code | 1
Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors | Code | 1
Overview of the TREC 2019 deep learning track | Code | 1
Assemble Foundation Models for Automatic Code Summarization | Code | 1
PAC-Bayes Compression Bounds So Tight That They Can Explain Generalization | Code | 1
Paced-Curriculum Distillation with Prediction and Label Uncertainty for Image Segmentation | Code | 1
Boosted Neural Decoders: Achieving Extreme Reliability of LDPC Codes for 6G Networks | Code | 1
Adapting BERT for Word Sense Disambiguation with Gloss Selection Objective and Example Sentences | Code | 1
Boosting Weakly Supervised Object Detection with Progressive Knowledge Transfer | Code | 1
Page 41 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al.[1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified