
Non-parametric Binary Regression in Metric Spaces with KL Loss

2020-10-19

Ariel Avital, Klim Efremenko, Aryeh Kontorovich, David Toplin, Bo Waggoner


Abstract

We propose a non-parametric variant of binary regression, where the hypothesis is regularized to be a Lipschitz function taking a metric space to [0,1] and the loss is logarithmic. This setting presents novel computational and statistical challenges. On the computational front, we derive a novel efficient optimization algorithm based on interior point methods; an attractive feature is that it is parameter-free (i.e., it does not require tuning an update step size). On the statistical front, the unbounded loss function presents a problem for classic generalization bounds based on covering-number and Rademacher techniques. We get around this challenge via an adaptive truncation approach, and also present a lower bound indicating that the truncation is, in some sense, necessary.
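As a rough illustration of why truncation matters for the logarithmic loss, consider the following sketch. The function name and the fixed threshold `t` are our own choices for exposition; the paper's approach uses an adaptive truncation rather than a fixed clip, and this snippet does not implement the authors' interior-point algorithm.

```python
import numpy as np

def truncated_log_loss(p, y, t=0.05):
    """Logarithmic (KL-type) loss for binary regression, with
    predictions clipped to [t, 1 - t].

    Without clipping, the loss -[y log p + (1-y) log(1-p)] is
    unbounded as p -> 0 on a positive label (or p -> 1 on a
    negative one); clipping bounds it by -log(t).
    """
    p = np.clip(np.asarray(p, dtype=float), t, 1.0 - t)
    y = np.asarray(y, dtype=float)
    return -(y * np.log(p) + (1.0 - y) * np.log1p(-p))

# A confident wrong prediction (p = 0 for label 1) would incur
# infinite log loss; truncation caps it at -log(0.05) ≈ 3.0.
preds = np.array([0.0, 0.5, 1.0])
labels = np.array([1.0, 1.0, 1.0])
losses = truncated_log_loss(preds, labels)
```

Here `losses[0]` equals `-log(0.05)` rather than infinity, which is the boundedness that classical covering-number and Rademacher arguments need.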
