SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model to solve a new, related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

(Image credit: Subodh Malgonde)

Papers

Showing 10226–10250 of 10307 papers

Title | Status | Hype
SpectralKD: A Unified Framework for Interpreting and Distilling Vision Transformers via Spectral Analysis | Code | 0
Spectrum Prediction With Deep 3D Pyramid Vision Transformer Learning | Code | 0
Word-level Embeddings for Cross-Task Transfer Learning in Speech Processing | Code | 0
Spider GAN: Leveraging Friendly Neighbors to Accelerate GAN Training | Code | 0
Spike encoding techniques for IoT time-varying signals benchmarked on a neuromorphic classification task | Code | 0
Split-Brain Autoencoders: Unsupervised Learning by Cross-Channel Prediction | Code | 0
SpotTune: Transfer Learning through Adaptive Fine-tuning | Code | 0
SqueezeBERT: What can computer vision teach NLP about efficient neural networks? | Code | 0
SQ-Whisper: Speaker-Querying based Whisper Model for Target-Speaker ASR | Code | 0
SSCL-IDS: Enhancing Generalization of Intrusion Detection with Self-Supervised Contrastive Learning | Code | 0
SSDA: Secure Source-Free Domain Adaptation | Code | 0
SSM-Net for Plants Disease Identification in Low Data Regime | Code | 0
SSS: Semi-Supervised SAM-2 with Efficient Prompting for Medical Imaging Segmentation | Code | 0
Stability of Graph Scattering Transforms | Code | 0
StarCraft Micromanagement with Reinforcement Learning and Curriculum Transfer Learning | Code | 0
Detecting False Data Injection Attacks in Smart Grids with Modeling Errors: A Deep Transfer Learning Based Approach | Code | 0
Steering a Historical Disease Forecasting Model Under a Pandemic: Case of Flu and COVID-19 | Code | 0
StochCA: A Novel Approach for Exploiting Pretrained Models with Cross-Attention | Code | 0
Straightforward Layer-wise Pruning for More Efficient Visual Adaptation | Code | 0
Strategies for Pretraining Neural Operators | Code | 0
Simplest Streaming Trees | Code | 0
Streaming Detection of Queried Event Start | Code | 0
Structural Alignment in Link Prediction | Code | 0
Structured Probabilistic Pruning for Convolutional Neural Network Acceleration | Code | 0
Structure Mapping for Transferability of Causal Models | Code | 0
Page 410 of 413

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified