SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a related but different task. The idea is to leverage the knowledge captured by a pre-trained model when solving a new but related problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.

( Image credit: Subodh Malgonde )
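The adaptation described above typically means freezing the pre-trained layers and training only a small task-specific head. A minimal sketch in PyTorch (the backbone here is a stand-in; in practice it would be loaded from a checkpoint trained on the source task, and the 5-class target task is hypothetical):

```python
import torch
import torch.nn as nn

# Stand-in "pretrained" feature extractor; in a real setting this would be
# loaded from a checkpoint trained on the source task.
backbone = nn.Sequential(
    nn.Linear(100, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
)

# Freeze the source-task knowledge: these weights will not be updated.
for p in backbone.parameters():
    p.requires_grad = False

# New head for a hypothetical 5-class target task.
head = nn.Linear(32, 5)
model = nn.Sequential(backbone, head)

# Only the head's parameters are given to the optimizer.
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

# One dummy fine-tuning step to show the loop shape.
x = torch.randn(8, 100)            # batch of 8 feature vectors
y = torch.randint(0, 5, (8,))      # dummy target-task labels
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
optimizer.step()

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)  # 32*5 + 5 = 165: only the new head is trained
```

Because the backbone is frozen, gradient computation and optimizer state cover only the 165 head parameters, which is what makes fine-tuning viable with limited target-task data.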

Papers

Showing 101–150 of 10,307 papers

Title | Status | Hype
Finetuning Large Language Models for Vulnerability Detection | Code | 2
MMA: Multi-Modal Adapter for Vision-Language Models | Code | 2
Graph Domain Adaptation: Challenges, Progress and Prospects | Code | 2
jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models | Code | 2
3D UX-Net: A Large Kernel Volumetric ConvNet Modernizing Hierarchical Transformer for Medical Image Segmentation | Code | 2
NeRF-MAE: Masked AutoEncoders for Self-Supervised 3D Representation Learning for Neural Radiance Fields | Code | 2
ExT5: Towards Extreme Multi-Task Scaling for Transfer Learning | Code | 2
External Knowledge Injection for CLIP-Based Class-Incremental Learning | Code | 2
On-Device Training Under 256KB Memory | Code | 2
On Efficient Reinforcement Learning for Full-length Game of StarCraft II | Code | 2
Exploring the Effect of Dataset Diversity in Self-Supervised Learning for Surgical Computer Vision | Code | 2
AdapterFusion: Non-Destructive Task Composition for Transfer Learning | Code | 2
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | Code | 2
Enhancing Zero-Shot Facial Expression Recognition by LLM Knowledge Transfer | Code | 2
Efficient Remote Sensing with Harmonized Transfer Learning and Modality Alignment | Code | 2
Event Stream-based Visual Object Tracking: A High-Resolution Benchmark Dataset and A Novel Baseline | Code | 2
All-in-one foundational models learning across quantum chemical levels | Code | 2
DinoBloom: A Foundation Model for Generalizable Cell Embeddings in Hematology | Code | 2
Do MIL Models Transfer? | Code | 2
ExpeL: LLM Agents Are Experiential Learners | Code | 2
Global birdsong embeddings enable superior transfer learning for bioacoustic classification | Code | 2
Sky-image-based solar forecasting using deep learning with multi-location data: training models locally, globally or via transfer learning? | Code | 2
Deep Neural Networks to Detect Weeds from Crops in Agricultural Environments in Real-Time: A Review | Code | 2
A Survey on Open-Vocabulary Detection and Segmentation: Past, Present, and Future | Code | 2
A Survey on Remote Sensing Foundation Models: From Vision to Multimodality | Code | 2
A Survey on Time-Series Pre-Trained Models | Code | 2
Deep Learning-Enabled Semantic Communication Systems with Task-Unaware Transmitter and Dynamic Data | Code | 2
Densely Connected Parameter-Efficient Tuning for Referring Image Segmentation | Code | 2
Current Trends in Deep Learning for Earth Observation: An Open-source Benchmark Arena for Image Classification | Code | 2
Constructing and Exploring Intermediate Domains in Mixed Domain Semi-supervised Medical Image Segmentation | Code | 2
PubLayNet: largest dataset ever for document layout analysis | Code | 2
Content-Based Search for Deep Generative Models | Code | 2
CLIP-Driven Universal Model for Organ Segmentation and Tumor Detection | Code | 2
CLAP: Learning Transferable Binary Code Representations with Natural Language Supervision | Code | 2
CLIP-Powered Domain Generalization and Domain Adaptation: A Comprehensive Survey | Code | 2
CARTE: Pretraining and Transfer for Tabular Learning | Code | 2
Continual Pre-training of Language Models | Code | 2
Cross-lingual Contextualized Topic Models with Zero-shot Learning | Code | 2
BiomedGPT: A Generalist Vision-Language Foundation Model for Diverse Biomedical Tasks | Code | 2
Spatio-Temporal Few-Shot Learning via Diffusive Neural Network Generation | Code | 2
Deep learning for time series classification | Code | 2
Deep Model Reassembly | Code | 2
CascadeTabNet: An approach for end to end table detection and structure recognition from image-based documents | Code | 2
Discovery of 2D materials using Transformer Network based Generative Design | Code | 2
Automated MRI Quality Assessment of Brain T1-weighted MRI in Clinical Data Warehouses: A Transfer Learning Approach Relying on Artefact Simulation | Code | 2
Dynamic Adapter Meets Prompt Tuning: Parameter-Efficient Transfer Learning for Point Cloud Analysis | Code | 2
AUFormer: Vision Transformers are Parameter-Efficient Facial Action Unit Detectors | Code | 2
AXIAL: Attention-based eXplainability for Interpretable Alzheimer's Localized Diagnosis using 2D CNNs on 3D MRI brain scans | Code | 2
Actuarial Applications of Natural Language Processing Using Transformers: Case Studies for Using Text Features in an Actuarial Context | Code | 2
A Transfer Learning and Optimized CNN Based Intrusion Detection System for Internet of Vehicles | Code | 2
Page 3 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | - | Unverified
2 | DFA-ENT | Accuracy | 69.2 | - | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | - | Unverified
4 | EasyTL | Accuracy | 63.3 | - | Unverified
5 | MEDA | Accuracy | 60.3 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | - | Unverified