
Lower bounds on minimax rates for nonparametric regression with additive sparsity and smoothness

NeurIPS 2009 · 2009-12-01

Garvesh Raskutti, Bin Yu, Martin J. Wainwright


Abstract

This paper uses information-theoretic techniques to determine minimax rates for estimating nonparametric sparse additive regression models under high-dimensional scaling. We assume an additive decomposition of the form f^*(X_1, …, X_p) = \sum_{j ∈ S} h_j(X_j), where each component function h_j lies in some Hilbert space ℋ and S ⊂ {1, …, p} is an unknown subset with cardinality s = |S|. Given n i.i.d. observations of f^*(X) corrupted with white Gaussian noise, where the covariate vectors (X_1, X_2, …, X_p) are drawn with i.i.d. components from some distribution ℙ, we determine tight lower bounds on the minimax rate for estimating the regression function with respect to squared error. The main result shows that the minimax rate scales as max{s log(p/s)/n, s ε_n^2(ℋ)}. The first term reflects the difficulty of performing subset selection and is independent of the Hilbert space ℋ; the second term is an s-dimensional estimation term, depending only on the low dimension s but not the ambient dimension p, that captures the difficulty of estimating a sum of s univariate functions in the Hilbert space ℋ. As a special case, if ℋ corresponds to the m-th order Sobolev space of functions that are m-times differentiable, the s-dimensional estimation term takes the form s ε_n^2 ≍ s n^{-2m/(2m+1)}. Comparing these minimax rates with the rates achieved by an ℓ_1-penalty based approach shows that a certain ℓ_1-based method achieves the minimax-optimal rate.
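For the Sobolev special case, the lower bound can be evaluated numerically as the maximum of the subset-selection term s log(p/s)/n and the estimation term s n^{-2m/(2m+1)}. The sketch below (the function name and sample values are illustrative choices, not from the paper) shows how the two terms trade off:

```python
import math

def minimax_lower_bound(n, p, s, m):
    """Illustrative evaluation of the two terms in the lower bound
    max(s*log(p/s)/n, s*n^(-2m/(2m+1))) for the m-th order Sobolev case."""
    # Cost of searching over the unknown s-subset S of {1, ..., p}
    subset_selection = s * math.log(p / s) / n
    # Cost of estimating s univariate m-smooth functions
    estimation = s * n ** (-2 * m / (2 * m + 1))
    return max(subset_selection, estimation)

# Example: n = 1000 samples, ambient dimension p = 10000,
# sparsity s = 10, smoothness m = 2 (twice differentiable)
print(minimax_lower_bound(1000, 10000, 10, 2))  # ≈ 0.069
```

With these sample values the subset-selection term dominates; increasing n while holding p and s fixed eventually makes the s-dimensional estimation term the bottleneck, matching the two regimes described in the abstract.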
