
Robust learning of halfspaces under log-concave marginals

2025-05-19

Jane Lange, Arsen Vasilyan


Abstract

We say that a classifier is adversarially robust to perturbations of norm r if, with high probability over a point x drawn from the input distribution, there is no point within distance r from x that is classified differently. The boundary volume is the probability that a point falls within distance r of a point with a different label. This work studies the task of computationally efficient learning of hypotheses with small boundary volume, where the input is distributed as a subgaussian isotropic log-concave distribution over R^d. Linear threshold functions are adversarially robust; they have boundary volume proportional to r. Such concept classes are efficiently learnable by polynomial regression, which produces a polynomial threshold function (PTF), but PTFs in general may have boundary volume Ω(1), even for r ≪ 1. We give an algorithm that agnostically learns linear threshold functions and returns a classifier with boundary volume O(r+ε) at radius of perturbation r. The time and sample complexity of d^{O(1/ε^2)} matches the complexity of polynomial regression. Our algorithm augments the classic approach of polynomial regression with three additional steps: a) performing the ℓ_1-error regression under noise sensitivity constraints, b) a structured partitioning and rounding step that returns a Boolean classifier with error opt + O(ε) and noise sensitivity O(r+ε) simultaneously, and c) a local corrector that "smooths" a function with low noise sensitivity into a function that is adversarially robust.
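The claim that linear threshold functions have boundary volume proportional to r can be checked numerically: for a unit-norm halfspace, a point lies within distance r of the decision boundary exactly when its margin |w·x| is at most r. The sketch below (an illustration under the assumption of a standard Gaussian marginal, which is isotropic log-concave; it is not code from the paper) estimates this probability by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, r = 10, 200_000, 0.1

# Unit normal vector of a hypothetical halfspace f(x) = sign(w . x).
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

# Isotropic Gaussian marginal (a subgaussian isotropic log-concave distribution).
x = rng.standard_normal((n, d))

# Signed distance from each sample to the hyperplane {x : w . x = 0}.
margin = x @ w

# Boundary volume at radius r: fraction of points within distance r of the
# boundary, i.e. P(|w . x| <= r).  Since w . x is a standard Gaussian, this
# equals 2*Phi(r) - 1, which is Theta(r) for small r.
boundary_volume = np.mean(np.abs(margin) <= r)
print(boundary_volume)
```

For r = 0.1 this estimate comes out near 2Φ(0.1) − 1 ≈ 0.08, and halving r roughly halves it, matching the linear dependence on r stated above.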
