
Phase transitions for high-dimensional joint support recovery

2008-12-01 · NeurIPS 2008

Sahand Negahban, Martin J. Wainwright


Abstract

We consider the following instance of transfer learning: given a pair of regression problems, suppose that the regression coefficients share a partially common support, parameterized by the overlap fraction α between the two supports. This set-up suggests the use of ℓ1,∞-regularized linear regression for recovering the support sets of both regression vectors. Our main contribution is to provide a sharp characterization of the sample complexity of this ℓ1,∞ relaxation, exactly pinning down the minimal sample size n required for joint support recovery as a function of the model dimension p, support size s, and overlap α ∈ [0,1]. For measurement matrices drawn from standard Gaussian ensembles, we prove that the joint ℓ1,∞-regularized method undergoes a phase transition characterized by the order parameter θℓ1,∞(n, p, s, α) = n / [(4 − 3α) s log(p − (2 − α)s)]. More precisely, the probability of successfully recovering both supports converges to 1 for scalings such that θℓ1,∞ > 1, and converges to 0 for scalings such that θℓ1,∞ < 1. An implication of this threshold is that use of ℓ1,∞-regularization leads to gains in sample complexity if the overlap parameter is large enough (α > 2/3), but performs worse than a naive approach if α < 2/3. We illustrate the close agreement between these theoretical predictions and the actual behavior in simulations. Thus, our results illustrate both the benefits and dangers associated with block-ℓ1,∞ regularization in high-dimensional inference.
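The order parameter in the abstract can be evaluated numerically to see how the predicted sample-size threshold varies with the overlap α. The sketch below is illustrative only: the function names (theta, critical_n) and the specific values of p, s, and α are our own choices, not from the paper; it simply plugs numbers into the stated formula θ = n / [(4 − 3α) s log(p − (2 − α)s)].

```python
import math

def theta(n, p, s, alpha):
    """Order parameter from the abstract:
    theta = n / ((4 - 3*alpha) * s * log(p - (2 - alpha)*s))."""
    return n / ((4 - 3 * alpha) * s * math.log(p - (2 - alpha) * s))

def critical_n(p, s, alpha):
    """Sample size at which theta = 1, i.e. the predicted phase transition."""
    return (4 - 3 * alpha) * s * math.log(p - (2 - alpha) * s)

# Illustrative example: dimension p = 1024, support size s = 16.
# Larger overlap alpha lowers the (4 - 3*alpha) prefactor, so the
# predicted threshold on n decreases monotonically in alpha.
for alpha in (0.0, 2 / 3, 1.0):
    print(f"alpha = {alpha:.2f}: critical n ~ {critical_n(1024, 16, alpha):.0f}")
```

By construction, θ evaluated at the critical sample size equals 1, and scalings with n above (below) this value give θ > 1 (θ < 1), matching the success/failure regimes described in the abstract.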
