Gradient Kernel Regression
2021-04-13
Matt Calder
Abstract
This article demonstrates a surprising result involving the neural tangent kernel. For a pair of inputs, this kernel is defined as the inner product of the gradients of the underlying model's output with respect to its parameters, evaluated at each input. Computing the kernel over the training points allows it to be used for kernel regression. The surprising finding is that the accuracy of that regression is independent of the accuracy of the underlying network.
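To make the construction concrete, the following is a minimal sketch of gradient kernel regression, not the paper's own code. It uses a hypothetical tiny one-hidden-layer network whose weights are random and never trained, approximates the parameter gradients by finite differences, forms the kernel K[i, j] = ∇f(x_i) · ∇f(x_j), and then performs ordinary ridge kernel regression with that kernel. The model, sizes, and regularization constant are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical untrained one-hidden-layer network; the weights stay
# random throughout, mirroring the claim that the underlying network's
# own accuracy does not matter.
H = 200
theta = np.concatenate([2.0 * rng.normal(size=H),   # input weights w
                        rng.normal(size=H)])        # output weights v

def f(x, theta):
    """Network output for a 1-D input array x: sum_j v_j * tanh(w_j * x)."""
    w, v = theta[:H], theta[H:]
    return np.tanh(np.outer(x, w)) @ v

def grad(x, theta, eps=1e-5):
    """Central-difference gradient of f w.r.t. parameters, one row per input."""
    g = np.zeros((len(x), len(theta)))
    for j in range(len(theta)):
        e = np.zeros_like(theta)
        e[j] = eps
        g[:, j] = (f(x, theta + e) - f(x, theta - e)) / (2 * eps)
    return g

# Training data for a target the random network was never fit to.
x_train = np.linspace(-1, 1, 20)
y_train = np.sin(3 * x_train)

# Gradient (tangent) kernel on the training points.
G = grad(x_train, theta)
K = G @ G.T

# Ridge kernel regression: solve (K + lambda * I) alpha = y.
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(x_train)), y_train)

# Predict at new points via K(x, x_i) alpha.
x_test = np.linspace(-1, 1, 101)
y_pred = grad(x_test, theta) @ G.T @ alpha
```

Even though the network itself was never trained, the regression built on its gradient kernel fits the training targets closely, which is the behavior the abstract describes.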