SOTAVerified

Multi-Task Learning

Multi-task learning aims to learn several tasks simultaneously with a single model, while maximizing performance on one or all of them.

(Image credit: Cross-stitch Networks for Multi-task Learning)
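The most common realization of this idea is hard parameter sharing: a shared encoder feeds several small task-specific heads, and training minimizes a weighted sum of per-task losses. The sketch below is illustrative only; the layer sizes, the two example tasks, and the loss weights are all hypothetical, not taken from any paper on this page.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared encoder weights (input dim 16 -> hidden dim 8), used by every task.
W_shared = rng.normal(scale=0.1, size=(16, 8))

# Task-specific heads: task A is 3-way classification, task B is regression.
# Both tasks and their dimensions are made up for illustration.
W_task_a = rng.normal(scale=0.1, size=(8, 3))
W_task_b = rng.normal(scale=0.1, size=(8, 1))

def forward(x):
    """Run the shared trunk once, then each task head on the shared features."""
    h = np.tanh(x @ W_shared)   # shared representation
    logits_a = h @ W_task_a     # task A output (classification logits)
    pred_b = h @ W_task_b       # task B output (regression value)
    return logits_a, pred_b

x = rng.normal(size=(4, 16))    # a batch of 4 examples
logits_a, pred_b = forward(x)

# The training objective is typically a weighted sum of per-task losses;
# the weights (1.0 and 0.5 here) are tuning knobs, not fixed by the method.
loss_a = float(np.mean(logits_a ** 2))  # stand-in for a real task-A loss
loss_b = float(np.mean(pred_b ** 2))    # stand-in for a real task-B loss
total_loss = 1.0 * loss_a + 0.5 * loss_b
```

Because the encoder is shared, gradients from both tasks update `W_shared`; methods such as PCGrad and CAGrad in the benchmark tables below exist precisely to manage conflicts between those per-task gradients.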

Papers

Showing 2001–2025 of 3687 papers

| Title | Status | Hype |
|---|---|---|
| Three Birds with One Stone: Multi-Task Temporal Action Detection via Recycling Temporal Annotations | | 0 |
| Polygonal Building Extraction by Frame Field Learning | Code | 1 |
| Enhancing Question Generation with Commonsense Knowledge | | 0 |
| Practical Transferability Estimation for Image Classification Tasks | | 0 |
| Multi-Task Learning for User Engagement and Adoption in Live Video Streaming Events | Code | 0 |
| Unsupervised Domain Adaptation for Dysarthric Speech Detection via Domain Adversarial Training and Mutual Information Minimization | | 0 |
| Multi-Task Learning and Adapted Knowledge Models for Emotion-Cause Extraction | | 0 |
| Bridging Multi-Task Learning and Meta-Learning: Towards Efficient Training and Effective Adaptation | Code | 1 |
| Minimizing Communication while Maximizing Performance in Multi-Agent Reinforcement Learning | | 0 |
| E2E-based Multi-task Learning Approach to Joint Speech and Accent Recognition | | 0 |
| Interpretable Self-supervised Multi-task Learning for COVID-19 Information Retrieval and Extraction | | 0 |
| Multi-script Handwritten Digit Recognition Using Multi-task Learning | | 0 |
| Poisoning Deep Reinforcement Learning Agents with In-Distribution Triggers | | 0 |
| HistoTransfer: Understanding Transfer Learning for Histopathology | | 0 |
| Anomalous Sound Detection Using a Binary Classification Model and Class Centroids | | 0 |
| Nested and Balanced Entity Recognition using Multi-Task Learning | | 0 |
| Meta-Adaptive Nonlinear Control: Theory and Algorithms | Code | 1 |
| Instance-Level Task Parameters: A Robust Multi-task Weighting Framework | | 0 |
| TASK AWARE MULTI-TASK LEARNING FOR SPEECH TO TEXT TASKS | | 0 |
| A Semi-supervised Multi-task Learning Approach to Classify Customer Contact Intents | | 0 |
| DUET: Detection Utilizing Enhancement for Text in Scanned or Captured Documents | | 0 |
| Thompson Sampling with a Mixture Prior | | 0 |
| Salient Object Ranking with Position-Preserved Attention | Code | 1 |
| One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets | | 0 |
| Parameter-efficient Multi-task Fine-tuning for Transformers via Shared Hypernetworks | Code | 1 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PCGrad | ∆m% | 125.7 | | Unverified |
| 2 | CAGrad | ∆m% | 112.8 | | Unverified |
| 3 | IMTL-G | ∆m% | 77.2 | | Unverified |
| 4 | Nash-MTL | ∆m% | 62 | | Unverified |
| 5 | BayesAgg-MTL | ∆m% | 53.7 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SwinMTL | mIoU | 76.41 | | Unverified |
| 2 | Nash-MTL | mIoU | 75.41 | | Unverified |
| 3 | MultiObjectiveOptimization | mIoU | 66.63 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SwinMTL | Mean IoU | 58.14 | | Unverified |
| 2 | Nash-MTL | Mean IoU | 40.13 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Gumbel-Matrix Routing | Average Accuracy | 93.52 | | Unverified |
| 2 | Mixture-of-Experts | Average Accuracy | 92.19 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MGDA-UB | Error | 8.25 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | BayesAgg-MTL | delta_m | -2.23 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LETR | FH | 83.3 | | Unverified |