
Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds

2021-03-27

Alexander Novikov, Maxim Rakhuba, Ivan Oseledets


Abstract

In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, Riemannian optimization is a popular tool for finding low-rank approximations. Nevertheless, efficient implementation of the Riemannian gradients and Hessians required by Riemannian optimization algorithms can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-vector products between an approximate Riemannian Hessian and a given vector.
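To illustrate the core ingredient on the matrix case, here is a minimal sketch (not the authors' code) of computing a Riemannian gradient on the manifold of fixed-rank matrices: the Euclidean gradient, which in the paper's setting would come from automatic differentiation, is projected onto the tangent space at a point X = U S Vᵀ via P_X(Z) = P_U Z + Z P_V − P_U Z P_V, where P_U = U Uᵀ and P_V = V Vᵀ. The objective f(X) = ½‖X − A‖²_F and all variable names are illustrative assumptions.

```python
import numpy as np

def riemannian_gradient(Z, U, V):
    """Project a Euclidean gradient Z onto the tangent space of the
    fixed-rank manifold at X = U S V^T:
    P_X(Z) = P_U Z + Z P_V - P_U Z P_V, with P_U = U U^T, P_V = V V^T."""
    PU_Z = U @ (U.T @ Z)
    Z_PV = (Z @ V) @ V.T
    return PU_Z + Z_PV - (PU_Z @ V) @ V.T

rng = np.random.default_rng(0)
m, n, r = 6, 5, 2

# A random rank-r point on the manifold, X = U S V^T with orthonormal U, V.
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
S = np.diag(rng.standard_normal(r))
X = U @ S @ V.T

# Euclidean gradient of f(X) = 0.5 * ||X - A||_F^2; in the paper's setting
# this would be produced by automatic differentiation instead.
A = rng.standard_normal((m, n))
Z = X - A

G = riemannian_gradient(Z, U, V)
# The projection is idempotent: projecting G again leaves it unchanged.
assert np.allclose(G, riemannian_gradient(G, U, V))
```

The point of the paper is that for tensor-train manifolds such projections need never be written out by hand: composing automatic differentiation with the manifold structure yields the Riemannian gradient at comparable cost.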
