
Adaptivity for Regularized Kernel Methods by Lepskii's Principle

2018-04-15

Nicole Mücke


Abstract

We address the problem of adaptivity in the framework of reproducing kernel Hilbert space (RKHS) regression. More precisely, we analyze estimators arising from a general linear regularization scheme g_λ. In practical applications, an important task is to choose the regularization parameter appropriately, i.e. based only on the given data and independently of unknown structural assumptions on the regression function. An attractive approach that avoids data-splitting is the Lepskii Principle (LP), also known in this setting as the Balancing Principle. We show that a modified parameter choice based on (LP) is minimax optimal adaptive, up to log log(n). A convenient result is the fact that balancing in the L^2(ν)-norm, which is easiest, automatically gives optimal balancing in all stronger norms interpolating between L^2(ν) and the RKHS. An analogous result is open for other classical approaches to data-dependent choices of the regularization parameter, e.g. for Hold-Out.
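The balancing idea behind (LP) can be sketched concretely: compute estimators along a decreasing grid of regularization parameters and keep the largest parameter whose estimator stays within a constant multiple of the variance bound of every estimator with a smaller parameter. The sketch below is illustrative only and is not the paper's exact construction: kernel ridge regression stands in for the general scheme g_λ, the variance proxy `sigma` and the constant `C = 4` are assumed placeholders, and distances are measured in the empirical L^2-norm on the sample.

```python
import numpy as np

def krr_fit(K, y, lam):
    """Kernel ridge regression coefficients: solve (K + n*lam*I) alpha = y."""
    n = len(y)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def balancing_principle(K, y, lambdas, sigma, C=4.0):
    """Lepskii-type balancing (illustrative sketch).

    lambdas : grid sorted in DECREASING order.
    sigma   : assumed variance bound, increasing as lam decreases.
    Returns the largest lam whose estimator is within C*sigma(mu) of the
    estimator at every smaller mu on the grid.
    """
    alphas = [krr_fit(K, y, lam) for lam in lambdas]

    def dist(a, b):
        # empirical L^2 distance between the two fitted functions
        d = K @ (a - b)
        return np.sqrt(np.mean(d ** 2))

    chosen = lambdas[-1]  # fallback: smallest lam on the grid
    for j, lam_j in enumerate(lambdas):
        if all(dist(alphas[j], alphas[i]) <= C * sigma(lambdas[i])
               for i in range(j + 1, len(lambdas))):
            chosen = lam_j
            break
    return chosen
```

Note that the rule is purely data-driven: it never evaluates the unknown regression function, only pairwise distances between computable estimators, which is what makes it attractive compared to oracle choices of the parameter.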
