
Orthogonal Nonnegative Matrix Factorization with the Kullback-Leibler divergence

2024-10-10 · Code Available

Jean Pacifique Nkurunziza, Fulgence Nahayo, Nicolas Gillis


Abstract

Orthogonal nonnegative matrix factorization (ONMF) has become a standard approach for clustering. To the best of our knowledge, most works on ONMF rely on the Frobenius norm to assess the quality of the approximation. This paper presents a new model and algorithm for ONMF that minimizes the Kullback-Leibler (KL) divergence. Whereas the Frobenius norm assumes Gaussian noise, the KL divergence is the maximum likelihood estimator for Poisson-distributed data, which better models sparse vectors of word counts in document data sets and photon-counting processes in imaging. We develop an algorithm based on alternating optimization, KL-ONMF, and show that it performs favorably compared with Frobenius-norm-based ONMF for document classification and hyperspectral image unmixing.
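The abstract does not spell out the updates of KL-ONMF. As a rough illustration only, the sketch below shows a k-means-style alternating scheme under the generalized KL divergence: hard one-hot assignments play the role of the orthogonal nonnegative factor H, and centroids are the KL-optimal representatives of each cluster. The function names and the update scheme are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def kl_div(X, Y, eps=1e-12):
    """Generalized Kullback-Leibler divergence D(X || Y) for nonnegative arrays."""
    return float(np.sum(X * np.log((X + eps) / (Y + eps)) - X + Y))

def kl_onmf_sketch(X, r, n_iter=50, seed=0):
    """Illustrative alternating scheme (not the paper's algorithm):
    hard cluster assignments give a one-hot H, so H @ H.T is diagonal
    (orthogonal rows), and W holds one nonnegative centroid per cluster."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 1e-3          # nonnegative basis / centroids
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # Assign each column of X to the centroid with the smallest KL cost.
        cost = np.empty((r, n))
        for k in range(r):
            wk = W[:, k:k + 1]             # m x 1 centroid
            cost[k] = np.sum(X * np.log((X + 1e-12) / (wk + 1e-12)) - X + wk,
                             axis=0)
        labels = cost.argmin(axis=0)
        # For generalized KL, the best single representative of a cluster is
        # the mean of its columns (set the coordinate-wise gradient to zero).
        for k in range(r):
            cols = X[:, labels == k]
            if cols.size:
                W[:, k] = cols.mean(axis=1)
    H = np.zeros((r, n))                   # one-hot H: orthogonal, nonnegative
    H[labels, np.arange(n)] = 1.0
    return W, H, labels
```

In a full KL-ONMF model the nonzero entries of H would carry positive scalings rather than being fixed to 1; this sketch fixes them to 1 purely to keep the orthogonality structure easy to see.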
