SOTAVerified

Fast Robust Kernel Regression through Sign Gradient Descent with Early Stopping

2023-06-29 · Code Available

Oskar Allerbo


Abstract

Kernel ridge regression, KRR, is a generalization of linear ridge regression that is non-linear in the data, but linear in the model parameters. Here, we introduce an equivalent formulation of the objective function of KRR, which opens up for replacing the ridge penalty with the ℓ∞ and ℓ1 penalties. Using the ℓ∞ and ℓ1 penalties, we obtain robust and sparse kernel regression, respectively. We study the similarities between explicitly regularized kernel regression and the solutions obtained by early stopping of iterative gradient-based methods, connecting ℓ∞ regularization to sign gradient descent, ℓ1 regularization to forward stagewise regression (also known as coordinate descent), and ℓ2 regularization to gradient descent; in the last case, we theoretically bound the differences. We exploit the close relations between ℓ∞ regularization and sign gradient descent, and between ℓ1 regularization and coordinate descent, to propose computationally efficient methods for robust and sparse kernel regression. Finally, we compare robust kernel regression through sign gradient descent to existing methods for robust kernel regression on five real data sets, demonstrating that our method is one to two orders of magnitude faster, without compromised accuracy.
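The central mechanism the abstract describes, kernel regression fitted by sign gradient descent with early stopping in place of an explicit penalty, can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the Gaussian kernel, step size, patience rule, and the names `rbf_kernel` and `sign_gd_kernel_regression` are all choices made here for the example.

```python
import numpy as np

def rbf_kernel(X, Z, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def sign_gd_kernel_regression(K, y, step=1e-3, n_iter=1000,
                              K_val=None, y_val=None, patience=25):
    """Fit kernel regression coefficients alpha by sign gradient descent on
    the squared loss 0.5 * ||y - K @ alpha||^2. Regularization comes from
    early stopping (optionally monitored on a held-out set), not from an
    explicit penalty term."""
    alpha = np.zeros(len(y))
    best_alpha, best_err, since_best = alpha.copy(), np.inf, 0
    for _ in range(n_iter):
        grad = K @ (K @ alpha - y)        # gradient of the squared loss
        alpha -= step * np.sign(grad)     # sign update: same step size per coordinate
        if K_val is not None:
            err = np.mean((y_val - K_val @ alpha) ** 2)
            if err < best_err:
                best_alpha, best_err, since_best = alpha.copy(), err, 0
            else:
                since_best += 1
                if since_best >= patience:  # validation error stopped improving
                    break
    return best_alpha if K_val is not None else alpha
```

The sign update moves every coordinate by the same fixed amount, which is what links the early-stopped iterates to ℓ∞-regularized solutions; replacing the update with a step on only the largest-gradient coordinate would give the coordinate-descent (ℓ1-like) variant.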
