
Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators

2020-06-03

Gil Kur, Fuchang Gao, Adityanand Guntuboyina, Bodhisattva Sen


Abstract

Under the usual nonparametric regression model with Gaussian errors, Least Squares Estimators (LSEs) over natural subclasses of convex functions are shown to be suboptimal for estimating a d-dimensional convex function in squared error loss when the dimension d is 5 or larger. The specific function classes considered include: (i) bounded convex functions supported on a polytope (in random design), (ii) Lipschitz convex functions supported on any convex domain (in random design), (iii) convex functions supported on a polytope (in fixed design). For each of these classes, the risk of the LSE is proved to be of the order n^(-2/d) (up to logarithmic factors) while the minimax risk is n^(-4/(d+4)), when d ≥ 5. In addition, the first rate of convergence results (worst case and adaptive) for the unrestricted convex LSE are established in fixed design for polytopal domains for all d ≥ 1. Some new metric entropy results for convex functions are also proved, which are of independent interest.
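The suboptimality claim reduces to comparing the two risk exponents: the LSE rate n^(-2/d) is slower than the minimax rate n^(-4/(d+4)) exactly when 2/d < 4/(d+4), i.e. when d > 4. The following minimal sketch (not from the paper; the function names are illustrative) tabulates both exponents to show where the gap opens:

```python
# Compare the squared-error risk exponents stated in the abstract.
# LSE rate: n^(-2/d); minimax rate: n^(-4/(d+4)).
# A larger exponent means faster convergence, so the LSE is
# suboptimal whenever its exponent is strictly smaller.

def lse_exponent(d: int) -> float:
    return 2 / d

def minimax_exponent(d: int) -> float:
    return 4 / (d + 4)

for d in range(1, 9):
    lse, mmx = lse_exponent(d), minimax_exponent(d)
    verdict = "suboptimal" if lse < mmx else "rate-optimal (up to logs)"
    print(f"d={d}: LSE n^-{lse:.3f} vs minimax n^-{mmx:.3f} -> {verdict}")
```

The exponents coincide at d = 4 (both equal 1/2), and for every d ≥ 5 the LSE exponent falls strictly below the minimax one, matching the abstract's threshold.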
