SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model instead of learning everything from scratch. This is useful when the new task has too little data to train a model from scratch, or when it is similar enough to the original task that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)
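The common "freeze the backbone, train a new head" recipe described above can be sketched as follows. This is a minimal illustration, not a real pipeline: the fixed random projection stands in for a genuinely pre-trained backbone, and the names `extract_features`, `W_backbone`, and the toy two-blob dataset are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pre-trained" feature extractor. In practice this would be a real
# backbone (e.g. an ImageNet-trained CNN) with its weights frozen;
# here a fixed random projection + ReLU plays that role.
W_backbone = rng.normal(size=(2, 16))

def extract_features(x):
    # Frozen: these weights are never updated during fine-tuning.
    return np.maximum(x @ W_backbone, 0.0)

# Toy downstream task with little data: two Gaussian blobs.
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),
               rng.normal(+1.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# New task-specific head, trained from scratch with gradient descent.
w = np.zeros(16)
b = 0.0
lr = 0.1
for _ in range(200):
    feats = extract_features(X)
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid
    grad = p - y                                # dLoss/dlogits for log-loss
    w -= lr * feats.T @ grad / len(y)           # only the head is updated
    b -= lr * grad.mean()

acc = np.mean((extract_features(X) @ w + b > 0) == (y == 1))
print(f"head-only fine-tuning accuracy: {acc:.2f}")
```

Fully fine-tuning the backbone as well (at a lower learning rate) is the other common variant; freezing it, as here, is the cheaper option when downstream data is scarce.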

Papers

Showing 201–225 of 10,307 papers

Title | Status | Hype
Can LLM Watermarks Robustly Prevent Unauthorized Knowledge Distillation? | Code | 1
M-ABSA: A Multilingual Dataset for Aspect-Based Sentiment Analysis | Code | 1
Hi-End-MAE: Hierarchical encoder-driven masked autoencoders are stronger vision learners for medical image segmentation | Code | 1
Instance-dependent Early Stopping | Code | 1
A Data-Efficient Pan-Tumor Foundation Model for Oncology CT Interpretation | Code | 1
UniGraph2: Learning a Unified Embedding Space to Bind Multimodal Graphs | Code | 1
ARWKV: Pretrain is not what we need, an RNN-Attention-Based Language Model Born from Transformer | Code | 1
WFCRL: A Multi-Agent Reinforcement Learning Benchmark for Wind Farm Control | Code | 1
Tackling Small Sample Survival Analysis via Transfer Learning: A Study of Colorectal Cancer Prognosis | Code | 1
UniTrans: A Unified Vertical Federated Knowledge Transfer Framework for Enhancing Cross-Hospital Collaboration | Code | 1
Surrogate-based multiscale analysis of experiments on thermoplastic composites under off-axis loading | Code | 1
Super-class guided Transformer for Zero-Shot Attribute Classification | Code | 1
AD-L-JEPA: Self-Supervised Spatial World Models with Joint Embedding Predictive Architecture for Autonomous Driving with LiDAR Data | Code | 1
Load Forecasting for Households and Energy Communities: Are Deep Learning Models Worth the Effort? | Code | 1
SimLTD: Simple Supervised and Semi-Supervised Long-Tailed Object Detection | Code | 1
EEG-Reptile: An Automatized Reptile-Based Meta-Learning Library for BCIs | Code | 1
Bridging the User-side Knowledge Gap in Knowledge-aware Recommendations with Large Language Models | Code | 1
Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Code | 1
Skip Tuning: Pre-trained Vision-Language Models are Effective and Efficient Adapters Themselves | Code | 1
MultiEYE: Dataset and Benchmark for OCT-Enhanced Retinal Disease Recognition from Fundus Images | Code | 1
GEAL: Generalizable 3D Affordance Learning with Cross-Modal Consistency | Code | 1
Monte Carlo Tree Search based Space Transfer for Black-box Optimization | Code | 1
T-TIME: Test-Time Information Maximization Ensemble for Plug-and-Play BCIs | Code | 1
Knowledge Transfer and Domain Adaptation for Fine-Grained Remote Sensing Image Segmentation | Code | 1
Finite Element Neural Network Interpolation. Part I: Interpretable and Adaptive Discretization for Solving PDEs | Code | 1
Page 9 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified