
Fast, smooth and adaptive regression in metric spaces

2009-12-01 · NeurIPS 2009

Samory Kpotufe


Abstract

It was recently shown that certain nonparametric regressors can escape the curse of dimensionality in the sense that their convergence rates adapt to the intrinsic dimension of the data (BL:65, SK:77). We prove stronger results in more general settings. In particular, we consider a regressor which, by combining aspects of both tree-based regression and kernel regression, operates on a general metric space, yields a smooth function, and evaluates in time O(log n). We derive a tight convergence rate of the form n^{-2/(2+d)}, where d is the Assouad dimension of the input space.
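The paper's actual estimator is not reproduced on this page, but the combination it names — a tree partition for fast lookup plus kernel weights for smoothness — can be illustrated with a minimal 1-D sketch. Everything below (the balanced binary tree, the Gaussian weights, the `leaf_size` and `bandwidth` parameters) is an illustrative assumption, not the paper's construction: the tree descent takes O(log n) steps, and the kernel average over the reached cell produces a smooth local fit.

```python
import math

class Node:
    def __init__(self, pts):
        self.pts = pts          # list of (x, y) samples in this cell
        self.left = self.right = None
        self.split = None       # split threshold for internal nodes

def build(pts, leaf_size=8):
    # Balanced binary partition of 1-D inputs: depth is O(log n).
    node = Node(pts)
    if len(pts) > leaf_size:
        pts = sorted(pts, key=lambda p: p[0])
        mid = len(pts) // 2
        node.split = pts[mid][0]
        node.left = build(pts[:mid], leaf_size)
        node.right = build(pts[mid:], leaf_size)
    return node

def predict(node, x, bandwidth=0.2):
    # Descend to the leaf cell containing x (O(log n) comparisons),
    # then return a Gaussian-kernel-weighted average over that cell:
    # the tree gives speed, the kernel gives smooth local weights.
    while node.split is not None:
        node = node.left if x < node.split else node.right
    num = den = 0.0
    for xi, yi in node.pts:
        w = math.exp(-((x - xi) / bandwidth) ** 2)
        num += w * yi
        den += w
    return num / den if den else 0.0

# Fit noiseless samples of f(x) = x^2 and query a point.
data = [(i / 100, (i / 100) ** 2) for i in range(101)]
tree = build(data)
y_hat = predict(tree, 0.5)      # close to 0.25, up to local bias
```

A real estimator in this family would balance cell diameter against bandwidth to get the adaptive n^{-2/(2+d)} rate; this sketch only shows the tree-plus-kernel evaluation structure.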
