SOTAVerified

Transfer Learning

Transfer learning is a machine learning technique in which a model trained on one task is repurposed and fine-tuned for a different but related task. The idea is to leverage the knowledge encoded in a pre-trained model when solving a new problem. This is useful when there is too little data to train a new model from scratch, or when the new task is similar enough to the original that the pre-trained model can be adapted with only minor modifications.
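The workflow described above — keep a pre-trained feature extractor frozen and fit only a small task-specific head on the limited target data — can be sketched in a few lines of NumPy. This is an illustrative toy, not the method of any paper listed below: the "pretrained" weights are a stand-in random projection, and all names (`W_pretrained`, `extract_features`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: in practice these weights would come
# from a model trained on a large source task; here a fixed random
# projection plays that role.
W_pretrained = rng.normal(size=(10, 32))      # 10 inputs -> 32 features

def extract_features(x):
    # Frozen backbone: these weights are NOT updated during fine-tuning.
    return np.tanh(x @ W_pretrained)

# Small target-task dataset (too little data to train from scratch).
X = rng.normal(size=(40, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # synthetic target labels

# Fine-tune only a new linear head on top of the frozen features.
F = extract_features(X)                       # shape (40, 32)
head, *_ = np.linalg.lstsq(F, y, rcond=None)  # closed-form head fit

preds = (F @ head > 0.5).astype(float)
accuracy = (preds == y).mean()
```

In a real setting the least-squares head would typically be replaced by a few epochs of gradient descent, and the backbone might be partially unfrozen once the head has converged.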

(Image credit: Subodh Malgonde)

Papers

Showing 6251–6300 of 10307 papers

Title | Status | Hype
Heterogeneous Multi-task Metric Learning across Multiple Domains | – | 0
Heterogeneous Representation Learning: A Review | – | 0
Heterogeneous domain adaptation: An unsupervised approach | – | 0
Heterogeneous transfer learning for high dimensional regression with feature mismatch | – | 0
Heterogeneous Transfer Learning in Ensemble Clustering | – | 0
Heterogenous Multi-Source Data Fusion Through Input Mapping and Latent Variable Gaussian Process | – | 0
Hey AI Can You Grade My Essay?: Automatic Essay Grading | – | 0
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | – | 0
Hierarchical Continual Reinforcement Learning via Large Language Model | – | 0
Hidden Layers in Perceptual Learning | – | 0
Hidden Markov Models and their Application for Predicting Failure Events | – | 0
Hidden Markov tree models for semantic class induction | – | 0
FacLens: Transferable Probe for Foreseeing Non-Factuality in Large Language Models | – | 0
Hidden State Variability of Pretrained Language Models Can Guide Computation Reduction for Transfer Learning | – | 0
Hierarchical Adaptive Expert for Multimodal Sentiment Analysis | – | 0
Hierarchical Cross-Modality Knowledge Transfer with Sinkhorn Attention for CTC-based ASR | – | 0
Hierarchical Deep Learning with Generative Adversarial Network for Automatic Cardiac Diagnosis from ECG Signals | – | 0
Hierarchical Granularity Transfer Learning | – | 0
Hierarchical Neural Network Approaches for Long Document Classification | – | 0
Hierarchical Optimal Transport for Comparing Histopathology Datasets | – | 0
Hierarchical Recurrent Aggregative Generation for Few-Shot NLG | – | 0
Hierarchical Relation-Guided Type-Sentence Alignment for Long-Tail Relation Extraction with Distant Supervision | – | 0
Hierarchical Side-Tuning for Vision Transformers | – | 0
Hierarchical Transfer Learning for Multi-label Text Classification | – | 0
Exploring Optimal Deep Learning Models for Image-based Malware Variant Classification | – | 0
MERL: Multi-Head Reinforcement Learning | – | 0
Higher-order Knowledge Transfer for Dynamic Community Detection with Great Changes | – | 0
High-Fidelity Accelerated MRI Reconstruction by Scan-Specific Fine-Tuning of Physics-Based Neural Networks | – | 0
High-Order Deep Meta-Learning with Category-Theoretic Interpretation | – | 0
High-Resolution Detection of Earth Structural Heterogeneities from Seismic Amplitudes using Convolutional Neural Networks with Attention layers | – | 0
Hi Sigma, do I have the Coronavirus?: Call for a New Artificial Intelligence Approach to Support Health Care Professionals Dealing With The COVID-19 Pandemic | – | 0
Histology Virtual Staining with Mask-Guided Adversarial Transfer Learning for Tertiary Lymphoid Structure Detection | – | 0
HistoTransfer: Understanding Transfer Learning for Histopathology | – | 0
HMAE: Self-Supervised Few-Shot Learning for Quantum Spin Systems | – | 0
Holistic Multi-Slice Framework for Dynamic Simultaneous Multi-Slice MRI Reconstruction | – | 0
HoloFed: Environment-Adaptive Positioning via Multi-band Reconfigurable Holographic Surfaces and Federated Learning | – | 0
HomoDistil: Homotopic Task-Agnostic Distillation of Pre-trained Transformers | – | 0
Homomorphisms Between Transfer, Multi-Task, and Meta-Learning Systems | – | 0
Homophily and missing links in citation networks | – | 0
Hot PATE: Private Aggregation of Distributions for Diverse Task | – | 0
On Transfer of Adversarial Robustness from Pretraining to Downstream Tasks | – | 0
How Can Cross-lingual Knowledge Contribute Better to Fine-Grained Entity Typing? | – | 0
How Can We Accelerate Progress Towards Human-like Linguistic Generalization? | – | 0
How Different Text-preprocessing Techniques Using The BERT Model Affect The Gender Profiling of Authors | – | 0
How Does Adversarial Fine-Tuning Benefit BERT? | – | 0
How does a Multilingual LM Handle Multiple Languages? | – | 0
How does Architecture Influence the Base Capabilities of Pre-trained Language Models? A Case Study Based on FFN-Wider and MoE Transformers | – | 0
How Does Data Diversity Shape the Weight Landscape of Neural Networks? | – | 0
Page 126 of 207

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | APCLIP | Accuracy | 84.2 | – | Unverified
2 | DFA-ENT | Accuracy | 69.2 | – | Unverified
3 | DFA-SAFN | Accuracy | 69.1 | – | Unverified
4 | EasyTL | Accuracy | 63.3 | – | Unverified
5 | MEDA | Accuracy | 60.3 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | CNN | 10-20% Mask PSNR | 3.23 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Chatterjee, Dutta et al. [1] | Accuracy | 96.12 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Co-Tuning | Accuracy | 85.65 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | Physical Access | EER | 5.74 | – | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | riadd.aucmedi | AUROC | 0.95 | – | Unverified