
Super-fast rates of convergence for Neural Networks Classifiers under the Hard Margin Condition

2025-05-13

Nathanael Tepakbong, Ding-Xuan Zhou, Xiang Zhou


Abstract

We study the classical binary classification problem for hypothesis spaces of Deep Neural Networks (DNNs) with ReLU activation under Tsybakov's low-noise condition with exponent q > 0, and its limit case q → ∞, which we refer to as the "hard-margin condition". We show that DNNs which minimize the empirical risk with square loss surrogate and ℓ_p penalty can achieve finite-sample excess risk bounds of order O(n^{-α}) for arbitrarily large α > 0 under the hard-margin condition, provided that the regression function is sufficiently smooth. The proof relies on a novel decomposition of the excess risk which might be of independent interest.
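The estimator in the abstract is empirical risk minimization with a square loss surrogate plus an ℓ_p penalty on the network weights, with the plug-in classifier sign(f). Below is a minimal sketch of that objective, assuming PyTorch; the network size, the penalty weight lam, and the exponent p are illustrative placeholders, not values from the paper.

```python
# Sketch (not the authors' code) of penalized ERM: square loss surrogate
# plus an l_p penalty on the weights of a ReLU network, labels y in {-1, +1}.
import torch
import torch.nn as nn

def relu_mlp(d_in: int, width: int = 64, depth: int = 3) -> nn.Sequential:
    """Plain fully connected ReLU network with scalar output."""
    layers, d = [], d_in
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers += [nn.Linear(d, 1)]
    return nn.Sequential(*layers)

def penalized_empirical_risk(f: nn.Module, x: torch.Tensor, y: torch.Tensor,
                             lam: float = 1e-3, p: float = 1.0) -> torch.Tensor:
    """(1/n) * sum_i (f(x_i) - y_i)^2 + lam * sum_w |w|^p."""
    sq_loss = ((f(x).squeeze(-1) - y) ** 2).mean()
    penalty = sum(w.abs().pow(p).sum() for w in f.parameters())
    return sq_loss + lam * penalty

if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(256, 2)
    # Toy labels with a clean decision boundary, loosely in the spirit of
    # the hard-margin setting (an assumption for this demo only).
    y = (x.norm(dim=1) > 1.0).float() * 2 - 1
    f = relu_mlp(d_in=2)
    opt = torch.optim.Adam(f.parameters(), lr=1e-2)
    for _ in range(200):
        opt.zero_grad()
        loss = penalized_empirical_risk(f, x, y)
        loss.backward()
        opt.step()
    # sign(f) is the plug-in classifier whose excess risk the paper bounds.
    acc = (torch.sign(f(x).squeeze(-1)) == y).float().mean().item()
    print(f"training accuracy: {acc:.3f}")
```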
