
Multitask Learning via Shared Features: Algorithms and Hardness

2022-09-07

Konstantina Bairaktari, Guy Blanc, Li-Yang Tan, Jonathan Ullman, Lydia Zakynthinou


Abstract

We investigate the computational efficiency of multitask learning of Boolean functions over the d-dimensional hypercube that are related by means of a feature representation of size k ≪ d shared across all tasks. We present a polynomial-time multitask learning algorithm for the concept class of halfspaces with margin γ, which is based on a simultaneous boosting technique and requires only poly(k/γ) samples per task and poly(k log(d)/γ) samples in total. In addition, we prove a computational separation: assuming there exists a concept class that cannot be learned in the attribute-efficient model, we construct another concept class that can be learned in the attribute-efficient model but cannot be multitask learned efficiently -- multitask learning this concept class either requires super-polynomial time complexity or a much larger total number of samples.
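To make the setting concrete, here is a minimal illustrative sketch (not the paper's simultaneous-boosting algorithm) of the shared-feature model: T tasks over the d-dimensional hypercube whose labels each depend only on a common k-dimensional linear feature map. The projection `P`, per-task weights `W`, and the per-task perceptron learner are all assumptions made for illustration; the point is that once the shared representation is available, each task is a halfspace over only k features.

```python
# Illustrative sketch of multitask learning via a shared feature map.
# P (k x d) is the shared representation; each task is a halfspace over P x.
# All names and parameters here are hypothetical, chosen for the demo.
import numpy as np

rng = np.random.default_rng(0)
d, k, T, n = 50, 3, 4, 200  # ambient dim, shared-feature dim (k << d), tasks, samples/task

P = rng.standard_normal((k, d))   # shared feature representation (assumed known here)
W = rng.standard_normal((T, k))   # per-task halfspace weights over the shared features

def sample_task(t, m):
    """Draw m labeled examples for task t: x in {-1,+1}^d, y = sign(w_t . (P x))."""
    X = rng.choice([-1.0, 1.0], size=(m, d))
    y = np.sign(X @ P.T @ W[t])
    return X, y

def perceptron(F, y, epochs=50):
    """Classic perceptron on feature matrix F; data is separable by construction."""
    w = np.zeros(F.shape[1])
    for _ in range(epochs):
        for f, yi in zip(F, y):
            if yi * (w @ f) <= 0:
                w += yi * f
    return w

accs = []
for t in range(T):
    X, y = sample_task(t, n)
    F = X @ P.T                   # work in the shared k-dimensional feature space
    w = perceptron(F, y)
    Xte, yte = sample_task(t, 200)
    accs.append(np.mean(np.sign(Xte @ P.T @ w) == yte))
print(min(accs))
```

Each task is learned from only a few hundred samples because the effective dimension is k, not d; the hard part that the paper addresses is learning when the shared representation P is unknown and must be recovered jointly from all tasks.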
