Agnostic learning in (almost) optimal time via Gaussian surface area
Lucas Pesenti, Lucas Slot, Manuel Wiedmer
Abstract
The complexity of learning a concept class under Gaussian marginals in the difficult agnostic model is closely related to its L_1-approximability by low-degree polynomials. For any concept class with Gaussian surface area at most Γ, Klivans et al. (2008) show that degree d = O(Γ^2 / ε^4) suffices to achieve an ε-approximation. This leads to the best-known bounds on the complexity of learning a variety of concept classes. In this note, we improve their analysis by showing that degree d = O(Γ^2 / ε^2) is enough. In light of lower bounds due to Diakonikolas et al. (2021), this yields (near) optimal bounds on the complexity of agnostically learning polynomial threshold functions in the statistical query model. Our proof relies on a direct analogue of a construction of Feldman et al. (2020), who considered L_1-approximation on the Boolean hypercube.
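The degree bound in the abstract can be read as a formal L_1-approximation statement. The following sketch is a paraphrase under assumed notation (the symbols f, p, and the standard Gaussian measure γ_n are my own labels, not quoted from the paper's body):

```latex
% Hedged restatement of the improved bound from the abstract.
% Notation (f, p, \gamma_n) is assumed for illustration.
\[
  \text{If } f \colon \mathbb{R}^n \to \{-1,1\}
  \text{ has Gaussian surface area at most } \Gamma,
\]
\[
  \text{then some polynomial } p \text{ of degree }
  d = O\!\left(\Gamma^2/\varepsilon^2\right)
  \text{ satisfies }
  \mathop{\mathbb{E}}_{x \sim \gamma_n}\bigl[\,|f(x)-p(x)|\,\bigr] \le \varepsilon .
\]
```

This tightens the earlier requirement d = O(Γ^2 / ε^4) of Klivans et al. (2008) by improving the dependence on ε from ε^4 to ε^2.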