
Convergence guarantees for forward gradient descent in the linear regression model

2023-09-26

Thijs Bos, Johannes Schmidt-Hieber


Abstract

Renewed interest in the relationship between artificial and biological neural networks motivates the study of gradient-free methods. Considering the linear regression model with random design, we theoretically analyze the biologically motivated (weight-perturbed) forward gradient scheme, which is based on random linear combinations of the gradient. If d denotes the number of parameters and k the number of samples, we prove that the mean squared error of this method converges for k ≳ d^2 log(d) with rate d^2 log(d)/k. Compared to the dimension dependence d for stochastic gradient descent, an additional factor d log(d) occurs.
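The weight-perturbed forward gradient scheme described in the abstract can be illustrated with a small numerical sketch. The code below is an assumption-laden illustration, not the paper's exact algorithm: it uses a constant step size 0.1/d (the paper's theory prescribes its own step-size schedule), a Gaussian random design with noise level 0.1, and the closed-form single-sample gradient of the squared loss, so the forward gradient (grad · v) v can be formed directly from a random direction v.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k = 10, 20000                  # d parameters, k samples (illustrative sizes)
theta_star = rng.normal(size=d)   # unknown regression vector

# linear regression model with random Gaussian design and noise level 0.1
X = rng.normal(size=(k, d))
y = X @ theta_star + 0.1 * rng.normal(size=k)

theta = np.zeros(d)
lr = 0.1 / d                      # constant step size (assumption, not the paper's schedule)
for i in range(k):
    grad = (X[i] @ theta - y[i]) * X[i]   # exact gradient of the single-sample squared loss
    v = rng.normal(size=d)                # random perturbation direction
    theta -= lr * (grad @ v) * v          # forward gradient step: (grad . v) v is unbiased for grad

mse = float(np.sum((theta - theta_star) ** 2))
```

Since E[v vᵀ] = I, the forward gradient (grad · v) v is an unbiased estimate of the true gradient, but its second moment carries an extra factor of order d, which is the mechanism behind the additional dimension factor in the convergence rate.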
