
Graph Metric Learning via Gershgorin Disc Alignment

2020-01-28

Cheng Yang, Gene Cheung, Wei Hu


Abstract

We propose a fast, general, projection-free metric learning framework, where the minimization objective min_{M ∈ S} Q(M) is a convex differentiable function of the metric matrix M, and M resides in the set S of generalized graph Laplacian matrices for connected graphs with positive edge weights and node degrees. Unlike the low-rank metric matrices common in the literature, S includes the important positive-diagonal-only matrices as a special case in the limit. The key idea for fast optimization is to rewrite the positive definite cone constraint in S as signal-adaptive linear constraints via Gershgorin disc alignment, so that the alternating optimization of the diagonal and off-diagonal terms in M can be solved efficiently as linear programs via Frank-Wolfe iterations. We prove that the Gershgorin discs can be aligned perfectly using the first eigenvector v of M, which we update iteratively using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as the diagonal and off-diagonal terms are optimized. Experiments show that our efficiently computed graph metric matrices outperform metrics learned using competing methods on classification tasks.
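The disc-alignment claim can be checked numerically: for a generalized graph Laplacian M (non-positive off-diagonals, connected graph) with smallest eigenvalue λ_min and strictly positive first eigenvector v, the similarity transform B = diag(v)^{-1} M diag(v) leaves every Gershgorin disc's left end exactly at λ_min. The sketch below is illustrative only (the example matrix is my own, not from the paper) and uses a dense eigensolver rather than LOBPCG for simplicity:

```python
import numpy as np

# Small connected graph with positive edge weights (hypothetical example).
W = np.array([[0., 2., 1.],
              [2., 0., 3.],
              [1., 3., 0.]])
L = np.diag(W.sum(axis=1)) - W            # combinatorial graph Laplacian
M = L + np.diag([0.5, 1.0, 0.2])          # generalized Laplacian (shifted diagonal)

lam, V = np.linalg.eigh(M)
v = V[:, 0]                               # first (smallest-eigenvalue) eigenvector
v = v * np.sign(v[0])                     # Perron vector: strictly positive entries

# Similarity transform B = diag(v)^{-1} M diag(v); eigenvalues are preserved.
B = M * (v[None, :] / v[:, None])

# Left end of Gershgorin disc i: B_ii minus the sum of |off-diagonals| in row i.
radii = np.abs(B).sum(axis=1) - np.abs(np.diag(B))
left_ends = np.diag(B) - radii

# All disc left ends coincide at lambda_min(M), so the PD constraint
# lambda_min(M) > 0 becomes one linear constraint per row of B.
print(np.allclose(left_ends, lam[0]))     # True
```

Since each aligned left end is linear in the entries of M once v is fixed, the positive definite cone constraint reduces to per-row linear inequalities, which is what makes the Frank-Wolfe linear-program steps in the paper possible.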
