
Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random

2025-01-16

Gautam Chandrasekaran, Vasilis Kontonis, Konstantinos Stavropoulos, Kevin Tian


Abstract

We study the problem of PAC learning γ-margin halfspaces with Massart noise. We propose a simple proper learning algorithm, the Perspectron, that has sample complexity O((εγ)^{-2}) and achieves classification error at most η + ε, where η is the Massart noise rate. Prior works [DGT19, CKMY20] came with worse sample complexity guarantees (in both ε and γ) or could only handle random classification noise [DDK+23, KIT+23] -- a much milder noise assumption. We also show that our results extend to the more challenging setting of learning generalized linear models with a known link function under Massart noise, achieving a similar sample complexity to the halfspace case. This significantly improves upon the prior state-of-the-art in this setting due to [CKMY20], who introduced this model.
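The abstract gives no pseudocode for the Perspectron itself. As a rough, hedged illustration of the γ-margin halfspace setting only (a classical margin-perceptron baseline on noiseless data, not the paper's algorithm and with none of its Massart-noise guarantees), one might sketch:

```python
import numpy as np

def margin_perceptron(X, y, gamma, epochs=200):
    """Toy margin-perceptron baseline (NOT the paper's Perspectron).

    X: (n, d) array with rows of norm <= 1; y in {-1, +1};
    gamma: assumed margin of the target halfspace.
    Updates on any example classified with margin <= gamma/2.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * xi.dot(w) <= gamma / 2:
                w += yi * xi
                updated = True
        if not updated:  # every example now has margin > gamma/2
            break
    return w

# Toy usage: unit-norm points separated by w* = (1, 0) with margin 0.1.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
X /= np.linalg.norm(X, axis=1, keepdims=True)
w_star = np.array([1.0, 0.0])
keep = np.abs(X @ w_star) >= 0.1  # enforce the gamma-margin condition
X, y = X[keep], np.sign(X[keep] @ w_star)
w_hat = margin_perceptron(X, y, gamma=0.1)
train_err = np.mean(np.sign(X @ w_hat) != y)
```

The classical mistake bound for this update rule is O(1/γ²), which is why the loop terminates with zero training error on separable margin data; the paper's contribution is achieving error η + ε with sample complexity O((εγ)^{-2}) even under Massart noise, which this baseline does not handle.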
