
Adaptive Step Sizes for Preconditioned Stochastic Gradient Descent

2023-11-28

Frederik Köhne, Leonie Kreis, Anton Schiela, Roland Herzog


Abstract

This paper proposes a novel approach to adaptive step sizes in stochastic gradient descent (SGD) by utilizing quantities that we have identified as numerically traceable: the Lipschitz constant for gradients and a notion of the local variance in search directions. Our findings yield a nearly hyperparameter-free algorithm for stochastic optimization with provable convergence properties that exhibits truly problem-adaptive behavior on classical image classification tasks. Our framework is set in a general Hilbert space and thus enables the potential inclusion of a preconditioner through the choice of the inner product.
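The abstract only names the ingredients, not the update rule, so the sketch below is not the authors' algorithm. It is a minimal NumPy illustration of the general idea: estimate a local gradient Lipschitz constant from successive iterates, take a step of roughly 1/L, and damp it by an empirical variance factor. The function name `adaptive_sgd`, the EMA decay `beta`, the fallback step size, and the specific damping formula are all assumptions introduced here for illustration; the sketch also uses the plain Euclidean inner product, whereas the paper's Hilbert-space setting would admit a preconditioned one.

```python
import numpy as np

def adaptive_sgd(grad, x0, n_steps=200, beta=0.9, eps=1e-12, rng=None):
    """Toy adaptive-step-size SGD (a sketch, not the paper's algorithm).

    grad : callable(x, rng) -> a stochastic gradient sample at x
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    x_prev, g_prev = None, None
    g_mean = np.zeros_like(x)  # EMA of gradients (crude mean estimate)
    g_sqmean = 0.0             # EMA of squared gradient norms
    eta = 1e-3                 # fallback step before L can be estimated
    for _ in range(n_steps):
        g = grad(x, rng)
        # Local Lipschitz estimate from successive gradient differences:
        # L_k ~ ||g_k - g_{k-1}|| / ||x_k - x_{k-1}||.
        if g_prev is not None:
            L = np.linalg.norm(g - g_prev) / (np.linalg.norm(x - x_prev) + eps)
            if L > 0:
                eta = 1.0 / L
        # Variance proxy via EMAs: E||g||^2 - ||E g||^2.
        g_mean = beta * g_mean + (1 - beta) * g
        g_sqmean = beta * g_sqmean + (1 - beta) * float(g @ g)
        var = max(g_sqmean - float(g_mean @ g_mean), 0.0)
        # Shrink the 1/L step when noise dominates the gradient signal.
        signal = float(g_mean @ g_mean)
        eta_eff = eta * signal / (signal + var + eps)
        x_prev, g_prev = x.copy(), g
        x = x - eta_eff * g
    return x

if __name__ == "__main__":
    # Noisy quadratic test problem: f(x) = 0.5 * x^T A x, gradient A x + noise.
    A = np.diag([1.0, 10.0])
    def noisy_grad(x, rng):
        return A @ x + 0.1 * rng.standard_normal(2)
    x_opt = adaptive_sgd(noisy_grad, x0=np.ones(2), rng=np.random.default_rng(0))
    print("approximate minimizer:", x_opt)
```

Note the intended qualitative behavior of this sketch: near a minimizer the iterates barely move while the stochastic gradients keep fluctuating, so the local Lipschitz estimate grows and the effective step size decays automatically, which is one way a rule of this shape can end up nearly hyperparameter-free.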
