Mirror descent of Hopfield model

2022-11-29

Hyungjoon Soh, Dongyeob Kim, Juno Hwang, Junghyo Jo

Abstract

Mirror descent is an elegant optimization technique that leverages a dual space of parametric models to perform gradient descent. While originally developed for convex optimization, it has increasingly been applied in the field of machine learning. In this study, we propose a novel approach for utilizing mirror descent to initialize the parameters of neural networks. Specifically, we demonstrate that by using the Hopfield model as a prototype for neural networks, mirror descent can effectively train the model with significantly improved performance compared to traditional gradient descent methods that rely on random parameter initialization. Our findings highlight the potential of mirror descent as a promising initialization technique for enhancing the optimization of machine learning models.
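The abstract describes mirror descent as gradient descent carried out through a dual space defined by a mirror map. The following is a minimal sketch of that general idea, not the paper's Hopfield-specific scheme: it uses the classic negative-entropy mirror map on the probability simplex (the exponentiated-gradient update), with illustrative function names.

```python
import numpy as np

def mirror_descent_simplex(grad_f, x0, lr=0.1, steps=200):
    """Mirror descent with the negative-entropy mirror map on the
    probability simplex (the exponentiated-gradient update).

    The primal point is mapped to the dual space via log, stepped
    along the negative gradient, and mapped back via exp; combined,
    this is a multiplicative update followed by renormalization.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x * np.exp(-lr * grad_f(x))  # dual-space gradient step
        x /= x.sum()                     # project back onto the simplex
    return x

# Toy example: minimize the linear cost f(x) = <c, x> over the simplex.
# The iterates concentrate on the coordinate with the smallest cost.
c = np.array([3.0, 1.0, 2.0])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
```

With a linear objective the update reduces to a softmax over the accumulated costs, so `x_star` places nearly all its mass on the cheapest coordinate (index 1 here) while remaining a valid probability vector throughout.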
