Fast Computation of Leave-One-Out Cross-Validation for k-NN Regression
2024-05-08
Motonobu Kanagawa
Abstract
We describe a fast computation method for leave-one-out cross-validation (LOOCV) for k-nearest neighbours (k-NN) regression. We show that, under a tie-breaking condition for nearest neighbours, the LOOCV estimate of the mean square error for k-NN regression is identical to the mean square error of (k+1)-NN regression evaluated on the training data, multiplied by the scaling factor (k+1)^2/k^2. Therefore, to compute the LOOCV score, one only needs to fit (k+1)-NN regression once, and does not need to repeat the training-validation of k-NN regression n times, where n is the number of training data points. Numerical experiments confirm the validity of the fast computation method.
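The intuition behind the identity: when (k+1)-NN regression is evaluated at a training point, that point is its own nearest neighbour (absent ties), so its prediction averages the point's own label with its k nearest other neighbours; the resulting residual is exactly k/(k+1) times the LOOCV residual of k-NN. The following is a minimal NumPy sketch (not from the paper; the synthetic data and brute-force k-NN implementation are illustrative assumptions) that checks the identity numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 5
X = rng.normal(size=(n, 2))              # continuous features, so ties are almost surely absent
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Pairwise squared Euclidean distances between all training points.
D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)

# Brute-force LOOCV for k-NN: for each point, average the k nearest *other* points.
D_loo = D + np.diag(np.full(n, np.inf))  # exclude each point from its own neighbours
loo_pred = y[np.argsort(D_loo, axis=1)[:, :k]].mean(axis=1)
loocv_mse = np.mean((y - loo_pred) ** 2)

# Fast method: (k+1)-NN evaluated on the training data (each point is its own
# nearest neighbour), then rescaled by (k+1)^2 / k^2.
train_pred = y[np.argsort(D, axis=1)[:, :k + 1]].mean(axis=1)
fast_loocv_mse = ((k + 1) / k) ** 2 * np.mean((y - train_pred) ** 2)

print(np.isclose(loocv_mse, fast_loocv_mse))  # True
```

The brute-force loop over n held-out points is replaced by a single fit and evaluation, which is what makes the method fast; with continuous random features the tie-breaking condition holds almost surely.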