
Deep Multitask Learning with Progressive Parameter Sharing

2023-01-01 · ICCV 2023

Haosen Shi, Shen Ren, Tianwei Zhang, Sinno Jialin Pan


Abstract

In this paper, we propose a novel progressive parameter-sharing strategy (MPPS) for effectively training multitask learning models on diverse computer vision tasks simultaneously. Specifically, we parameterize per-task distributions to control the sharing, based on the concept of Exclusive Capacity that we introduce. A scheduling mechanism inspired by curriculum learning is also designed to progressively increase the level of sharing during the learning process. We further propose a novel loss function to regularize the optimization of the network parameters as well as the sharing probability of each neuron for each task. Our approach can be combined with many state-of-the-art multitask learning solutions to achieve better joint task performance. Comprehensive experiments show that it achieves competitive performance on three challenging datasets (Multi-CIFAR100, NYUv2, and Cityscapes) using various convolutional neural network architectures.
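To make the abstract's idea concrete, here is a minimal, hypothetical sketch of the two mechanisms it describes: per-task, per-neuron sharing probabilities, and a curriculum-style schedule that raises the level of sharing over training. All function names, the linear schedule, and the Bernoulli sampling are illustrative assumptions; the paper learns the sharing probabilities jointly with the network weights rather than fixing them by a schedule alone.

```python
import numpy as np

def sharing_schedule(step, total_steps, p_start=0.1, p_end=0.9):
    """Curriculum-style schedule (illustrative assumption): linearly
    raise the target probability of cross-task sharing as training
    progresses, so tasks start mostly separate and end mostly shared."""
    t = min(step / total_steps, 1.0)
    return p_start + t * (p_end - p_start)

def sample_sharing_masks(num_tasks, num_neurons, p_share, rng):
    """Sample a binary mask per task and neuron: 1 means the neuron is
    shared across tasks, 0 means it is reserved as that task's
    'exclusive capacity' (hypothetical parameterization)."""
    return (rng.random((num_tasks, num_neurons)) < p_share).astype(np.int8)

rng = np.random.default_rng(0)
# Early in training: little sharing; late in training: heavy sharing.
early = sample_sharing_masks(3, 8, sharing_schedule(0, 100), rng)
late = sample_sharing_masks(3, 8, sharing_schedule(100, 100), rng)
```

In a full implementation the masks would gate which neurons each task's forward pass may use, and a regularization term would push the learned sharing probabilities toward the scheduled target.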
