
K-Deep Simplex: Deep Manifold Learning via Local Dictionaries

2020-12-03 · Code Available

Pranay Tankala, Abiy Tasissa, James M. Murphy, Demba Ba


Abstract

We propose K-Deep Simplex (KDS) which, given a set of data points, learns a dictionary comprising synthetic landmarks, along with representation coefficients supported on a simplex. KDS employs a local weighted ℓ₁ penalty that encourages each data point to represent itself as a convex combination of nearby landmarks. We solve the proposed optimization program using alternating minimization and design an efficient, interpretable autoencoder using algorithm unrolling. We theoretically analyze the proposed program by relating the weighted ℓ₁ penalty in KDS to a weighted ℓ₀ program. Assuming that the data are generated from a Delaunay triangulation, we prove the equivalence of the weighted ℓ₁ and weighted ℓ₀ programs. We further show the stability of the representation coefficients under mild geometric assumptions. If the representation coefficients are fixed, we prove that the sub-problem of minimizing over the dictionary yields a unique solution. Further, we show that low-dimensional representations can be efficiently obtained from the covariance of the coefficient matrix. Experiments show that the algorithm is highly efficient and performs competitively on synthetic and real data sets.
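To make the alternating-minimization scheme concrete, here is a minimal NumPy sketch. It assumes the objective takes the form ‖X − AW‖²_F + λ · Σᵢⱼ Wᵢⱼ‖xⱼ − aᵢ‖², with each coefficient column constrained to the probability simplex (this specific form, and all function and variable names, are illustrative choices, not the authors' reference implementation). The W-step uses projected gradient descent, exploiting the fact that the locality penalty is linear in W once the landmarks are fixed; the A-step is quadratic in A and therefore has a closed-form minimizer, consistent with the uniqueness claim in the abstract.

```python
import numpy as np

def project_simplex(V):
    """Euclidean projection of each column of V onto the probability simplex."""
    m, n = V.shape
    U = np.sort(V, axis=0)[::-1]                       # sort each column descending
    css = np.cumsum(U, axis=0) - 1.0
    ind = np.arange(1, m + 1)[:, None]
    rho = np.count_nonzero(U - css / ind > 0, axis=0)  # support size per column
    theta = css[rho - 1, np.arange(n)] / rho           # per-column threshold
    return np.maximum(V - theta, 0.0)

def kds(X, m, lam=0.1, outer=30, inner=20, seed=0):
    """Alternating-minimization sketch (illustrative, not the reference code) for
       min_{A, W}  ||X - A W||_F^2 + lam * sum_{ij} W_ij ||x_j - a_i||^2
       s.t. each column of W lies on the simplex."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    A = X[:, rng.choice(n, size=m, replace=False)].copy()  # init landmarks at data points
    W = np.full((m, n), 1.0 / m)
    for _ in range(outer):
        # W-step: projected gradient descent; the weighted l1 penalty is linear
        # in W (weights = squared distances to landmarks), so it only shifts the gradient.
        D = ((X[:, None, :] - A[:, :, None]) ** 2).sum(axis=0)   # (m, n) squared distances
        step = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2 + 1e-12)   # 1/L for the smooth part
        for _ in range(inner):
            grad = 2.0 * A.T @ (A @ W - X) + lam * D
            W = project_simplex(W - step * grad)
        # A-step: setting the gradient in A to zero gives
        #   A (W W^T + lam * diag(W 1)) = (1 + lam) X W^T,
        # a linear system with a unique solution when the Gram term is nonsingular.
        G = W @ W.T + lam * np.diag(W.sum(axis=1)) + 1e-8 * np.eye(m)  # tiny ridge for safety
        A = np.linalg.solve(G.T, ((1.0 + lam) * X @ W.T).T).T
    return A, W
```

Because the simplex constraint already bounds each coefficient column, no separate normalization step is needed; the projection enforces nonnegativity and unit sum simultaneously.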
