
Compressed Dictionary Learning

2018-05-02

Karin Schnass, Flavio Teixeira


Abstract

In this paper we show that the computational complexity of the Iterative Thresholding and K-residual-Means (ITKrM) algorithm for dictionary learning can be significantly reduced by using dimensionality-reduction techniques based on the Johnson-Lindenstrauss lemma. The dimensionality reduction is efficiently carried out with the fast Fourier transform. We introduce the Iterative compressed-Thresholding and K-Means (IcTKM) algorithm for fast dictionary learning and study its convergence properties. We show that IcTKM can locally recover an incoherent, overcomplete generating dictionary of K atoms from training signals of sparsity level S with high probability. Fast dictionary learning is achieved by embedding the training data and the dictionary into m < d dimensions, and recovery is shown to be locally stable with an embedding dimension which scales as low as m = O(S log^4 S log^3 K). The compression effectively shatters the data dimension bottleneck in the computational cost of ITKrM, reducing it by a factor O(m/d). Our theoretical results are complemented with numerical simulations which demonstrate that IcTKM is a powerful, low-cost algorithm for learning dictionaries from high-dimensional data sets.
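The FFT-based Johnson-Lindenstrauss embedding described in the abstract can be sketched as a subsampled randomized Fourier transform: random sign flips, an FFT, and uniform subsampling of m out of d frequencies. The function below is an illustrative sketch under that assumption (the function name and interface are ours, not the paper's); it is not the authors' implementation of IcTKM.

```python
import numpy as np

def fast_jl_embed(Y, m, rng=None):
    """Sketch of a fast JL embedding via a subsampled randomized
    Fourier transform. Y is a (d, N) matrix of training signals;
    returns an (m, N) complex matrix of embedded signals.
    Illustrative only -- names and choices here are assumptions."""
    rng = np.random.default_rng(rng)
    d = Y.shape[0]
    # Random sign flips spread the signals' energy over the Fourier basis.
    signs = rng.choice([-1.0, 1.0], size=d)
    # FFT applies the Fourier transform in O(d log d) per signal;
    # dividing by sqrt(d) makes it orthonormal (norm-preserving).
    F = np.fft.fft(signs[:, None] * Y, axis=0) / np.sqrt(d)
    # Uniformly subsample m of the d frequencies and rescale by
    # sqrt(d/m) so squared norms are preserved in expectation.
    rows = rng.choice(d, size=m, replace=False)
    return np.sqrt(d / m) * F[rows]
```

Applying the same embedding to both the data and the current dictionary estimate is what reduces the per-iteration cost by the factor O(m/d) stated in the abstract, since all inner products are then computed in m rather than d dimensions.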
