
Gradient Weights help Nonparametric Regressors

2012-12-01 · NeurIPS 2012

Samory Kpotufe, Abdeslam Boularias


Abstract

In regression problems over ℝ^d, the unknown function f often varies more in some coordinates than in others. We show that weighting each coordinate i with the estimated norm of the ith derivative of f is an efficient way to significantly improve the performance of distance-based regressors, e.g. kernel and k-NN regressors. We propose a simple estimator of these derivative norms and prove its consistency. Moreover, the proposed estimator is efficiently learned online.
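The idea in the abstract can be illustrated with a minimal sketch: estimate each coordinate's derivative norm by finite differences of a preliminary unweighted k-NN estimate, then use those norms as per-coordinate weights in the k-NN distance. This is only an illustrative approximation, not the paper's actual estimator; the function names, the finite-difference step `h`, and the toy data are all assumptions.

```python
import numpy as np

def knn_regress(Xtr, ytr, Xq, k, w):
    """k-NN regression where distances use per-coordinate weights w."""
    preds = []
    for x in Xq:
        d = np.sqrt((((Xtr - x) ** 2) * w).sum(axis=1))  # weighted Euclidean distance
        idx = np.argsort(d)[:k]                          # k nearest neighbors
        preds.append(ytr[idx].mean())
    return np.array(preds)

def gradient_weights(Xtr, ytr, k=5, h=0.1):
    """Crude estimate of the mean absolute ith derivative of f,
    via central differences of an unweighted k-NN fit (illustrative only,
    not the paper's estimator)."""
    d = Xtr.shape[1]
    w0 = np.ones(d)                # unweighted preliminary estimate
    grads = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        fp = knn_regress(Xtr, ytr, Xtr + e, k, w0)
        fm = knn_regress(Xtr, ytr, Xtr - e, k, w0)
        grads[i] = np.mean(np.abs((fp - fm) / (2 * h)))
    return grads

# Toy data: f depends only on coordinate 0, so its weight should dominate.
rng = np.random.default_rng(0)
X = rng.uniform(size=(400, 3))
y = np.sin(4 * X[:, 0]) + 0.05 * rng.standard_normal(400)
w = gradient_weights(X, y)
```

Using `w` in place of the uniform weights inside `knn_regress` then shrinks distances along uninformative coordinates, which is the effect the paper attributes to gradient weighting.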
