
Siamese Networks: The Tale of Two Manifolds

2019-10-01 · ICCV 2019

Soumava Kumar Roy, Mehrtash Harandi, Richard Nock, Richard Hartley


Abstract

Siamese networks are non-linear deep models that have found their way into a broad set of problems in learning theory, thanks to their embedding capabilities. In this paper, we study Siamese networks from a new perspective and question the validity of their training procedure. We show that in the majority of cases, the objective of a Siamese network is endowed with an invariance property. Neglecting this invariance hinders the training of Siamese networks. To alleviate this issue, we propose two Riemannian structures and generalize a well-established accelerated stochastic gradient descent method to take the proposed Riemannian structures into account. Our empirical evaluations suggest that by making use of Riemannian geometry, we achieve state-of-the-art results against several algorithms on the challenging problem of fine-grained image classification.
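As background for the training procedure the abstract questions, a minimal sketch of a Siamese pair with a contrastive objective is shown below. The toy encoder `embed`, its weight matrix `W`, and the margin value are hypothetical illustrations, not the paper's method; note that the loss depends only on the distance between embeddings, so it is invariant to, e.g., a shared orthogonal transform of both embeddings — the kind of invariance the paper argues should be respected during optimization.

```python
import numpy as np

def embed(x, W):
    # Both branches of a Siamese network share the same weights W
    # (hypothetical toy non-linear encoder for illustration).
    return np.tanh(W @ x)

def contrastive_loss(x1, x2, y, W, margin=1.0):
    # y = 1 for a similar pair, y = 0 for a dissimilar pair.
    d = np.linalg.norm(embed(x1, W) - embed(x2, W))
    return y * d**2 + (1 - y) * max(margin - d, 0.0)**2

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))
x = rng.standard_normal(8)

# A point paired with itself: zero distance, so zero loss for a
# similar label, and the full margin penalty for a dissimilar one.
print(contrastive_loss(x, x, 1, W))  # 0.0
print(contrastive_loss(x, x, 0, W))  # 1.0
```

Because the loss only sees pairwise distances, any rotation of the embedding space leaves it unchanged, which is one concrete source of the invariance discussed above.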
