
Can we learn gradients by Hamiltonian Neural Networks?

2021-10-31 · Code Available

Aleksandr Timofeev, Andrei Afonin, Yehao Liu


Abstract

In this work, we propose a meta-learner based on ODE neural networks that learns gradients. This approach makes the optimizer more flexible, inducing an automatic inductive bias for the given task. Using the simplest Hamiltonian Neural Network, we demonstrate that our method outperforms a meta-learner based on LSTM on an artificial task and on the MNIST dataset with ReLU activations in the optimizee. Furthermore, it also surpasses classic optimization methods on the artificial task and achieves comparable results on MNIST.
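The abstract's core building block, a Hamiltonian Neural Network, can be sketched in a few lines: an MLP parameterizes a scalar Hamiltonian H(q, p), and the state's time derivatives are obtained from Hamilton's equations via automatic differentiation. This is a minimal illustrative sketch, not the authors' implementation; the network shape, hidden width, and dimensions are assumptions.

```python
import torch
import torch.nn as nn


class HNN(nn.Module):
    """Minimal Hamiltonian Neural Network: an MLP outputs a scalar
    Hamiltonian H(q, p); dynamics follow dq/dt = dH/dp, dp/dt = -dH/dq."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        # Input is the concatenated phase-space state (q, p).
        self.H = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x = concat(q, p); differentiate H w.r.t. the state to get the field.
        x = x.requires_grad_(True)
        grad = torch.autograd.grad(self.H(x).sum(), x, create_graph=True)[0]
        dH_dq, dH_dp = grad.chunk(2, dim=-1)
        # Hamiltonian vector field: (dq/dt, dp/dt) = (dH/dp, -dH/dq).
        return torch.cat([dH_dp, -dH_dq], dim=-1)


model = HNN(dim=1)
out = model(torch.randn(4, 2))
print(out.shape)  # torch.Size([4, 2])
```

In the meta-learning setting described above, such a network would be integrated as an ODE to produce parameter updates for the optimizee, replacing the hand-designed update rule of a classic optimizer.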
