SOTAVerified

Expectation-maximization for logistic regression

2013-05-31

James G. Scott, Liang Sun


Abstract

We present a family of expectation-maximization (EM) algorithms for binary and negative-binomial logistic regression, drawing a sharp connection with the variational-Bayes algorithm of Jaakkola and Jordan (2000). Indeed, our results allow a version of this variational-Bayes approach to be re-interpreted as a true EM algorithm. We study several interesting features of the algorithm, and of this previously unrecognized connection with variational Bayes. We also generalize the approach to sparsity-promoting priors, and to an online method whose convergence properties are easily established. This latter method compares favorably with stochastic-gradient descent in situations with marked collinearity.
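The EM scheme described here can be illustrated with a small sketch. In the E-step, each observation gets a weight equal to the conditional mean of a latent variable, which under the Jaakkola–Jordan quadratic bound works out to tanh(ψᵢ/2)/(2ψᵢ) for linear predictor ψᵢ; the M-step is then a weighted least-squares solve. The function name, iteration count, and ridge stabilizer below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def em_logistic(X, y, n_iter=100, ridge=1e-6):
    """EM for binary logistic regression via the quadratic
    (Jaakkola-Jordan) bound on the logistic log-likelihood.

    E-step: omega_i = tanh(psi_i / 2) / (2 * psi_i), psi = X @ beta
            (the limit at psi -> 0 is 1/4).
    M-step: solve (X' diag(omega) X + ridge * I) beta = X' (y - 1/2).
    """
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5  # centered responses; fixed across iterations
    for _ in range(n_iter):
        psi = X @ beta
        # E-step: weights from the variational/latent-variable bound;
        # guard the removable singularity at psi == 0.
        with np.errstate(divide="ignore", invalid="ignore"):
            omega = np.where(np.abs(psi) < 1e-8, 0.25,
                             np.tanh(psi / 2.0) / (2.0 * psi))
        # M-step: ridge-stabilized weighted least squares
        A = X.T @ (omega[:, None] * X) + ridge * np.eye(p)
        beta = np.linalg.solve(A, X.T @ kappa)
    return beta


# Usage on synthetic data with a known positive slope
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
probs = 1.0 / (1.0 + np.exp(-(X @ np.array([-0.5, 2.0]))))
y = (rng.random(200) < probs).astype(float)
beta_hat = em_logistic(X, y)
```

Each iteration is a closed-form ascent step on the bound, which is what makes the convergence analysis (and the online extension mentioned above) tractable.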
