A Simple Algorithm For Scaling Up Kernel Methods
2023-01-26
Teng Andrea Xu, Bryan Kelly, Semyon Malamud
Code: github.com/tengandreaxu/fabr (official)
Abstract
The recent discovery of the equivalence between infinitely wide neural networks (NNs) in the lazy training regime and Neural Tangent Kernels (NTKs) (Jacot et al., 2018) has revived interest in kernel methods. However, conventional wisdom suggests kernel methods are unsuitable for large samples due to their computational complexity and memory requirements. We introduce a novel random feature regression algorithm that allows us (when necessary) to scale to virtually infinite numbers of random features. We illustrate the performance of our method on the CIFAR-10 dataset.