Information-Computation Tradeoffs for Learning Margin Halfspaces with Random Classification Noise
Ilias Diakonikolas, Jelena Diakonikolas, Daniel M. Kane, Puqian Wang, Nikos Zarifis
Abstract
We study the problem of PAC learning γ-margin halfspaces with Random Classification Noise. We establish an information-computation tradeoff suggesting an inherent gap between the sample complexity of the problem and the sample complexity of computationally efficient algorithms. Concretely, the sample complexity of the problem is Θ̃(1/(γ²ε)). We start by giving a simple efficient algorithm with sample complexity Õ(1/(γ²ε²)). Our main result is a lower bound for Statistical Query (SQ) algorithms and low-degree polynomial tests suggesting that the quadratic dependence on 1/ε in the sample complexity is inherent for computationally efficient algorithms. Specifically, our results imply a lower bound of Ω̃(1/(γ^{1/2}ε²)) on the sample complexity of any efficient SQ learner or low-degree test.
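To make the setting concrete, the following is a minimal, hypothetical illustration (not the paper's algorithm): examples are unit vectors with margin at least γ around a true halfspace, each label is flipped independently with probability η (Random Classification Noise), and a learner runs averaged SGD on the logistic loss, a standard noise-tolerant convex surrogate. All constants (d, n, γ, η, learning rate) are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setting only, not the algorithm from the paper.
d, n, gamma, eta = 10, 20000, 0.2, 0.1
w_star = np.zeros(d)
w_star[0] = 1.0                                    # true unit normal

# Rejection-sample unit vectors satisfying the margin |<w*, x>| >= gamma.
X = rng.normal(size=(4 * n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
X = X[np.abs(X @ w_star) >= gamma][:n]
y_clean = np.sign(X @ w_star)
flip = rng.random(len(X)) < eta                    # RCN: flip each label w.p. eta
y = np.where(flip, -y_clean, y_clean)

# One pass of SGD on the logistic loss, averaging the iterates.
w = np.zeros(d)
w_sum = np.zeros(d)
lr = 0.1
for xi, yi in zip(X, y):
    w += lr * yi * xi / (1.0 + np.exp(yi * (w @ xi)))  # negative logistic gradient
    w_sum += w
w_avg = w_sum / len(X)

# Error is measured against the clean (pre-noise) labels.
err = np.mean(np.sign(X @ w_avg) != y_clean)
print(f"clean error: {err:.3f}")
```

The margin condition is what keeps the rejection-sampling step and the learner well behaved: any hypothesis sufficiently aligned with w* classifies all margin points correctly, despite the flipped training labels.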