Characterizing Evolution in Expectation-Maximization Estimates for Overspecified Mixed Linear Regression
Zhankun Luo, Abolfazl Hashemi
Abstract
Mixture models have attracted significant attention due to their practical effectiveness and comprehensive theoretical foundations. A persistent challenge is model misspecification, which occurs when the model to be fitted has more mixture components than the data distribution. In this paper, we develop a theoretical understanding of the Expectation-Maximization (EM) algorithm's behavior under this form of model misspecification for overspecified two-component Mixed Linear Regression (2MLR) with unknown d-dimensional regression parameters and mixing weights. In Theorem 5.1, at the population level, with an unbalanced initial guess for the mixing weights we establish linear convergence of the regression parameters, reaching ε-accuracy in Euclidean distance in O(log(1/ε)) steps; conversely, with a balanced initial guess for the mixing weights, convergence is sublinear, requiring O(ε^-2) steps. In Theorem 6.1, at the finite-sample level, for mixtures with sufficiently unbalanced fixed mixing weights we demonstrate a statistical accuracy of O((d/n)^1/2), whereas for sufficiently balanced fixed mixing weights the accuracy is O((d/n)^1/4), given n data samples. Furthermore, we underscore the connection between our population-level and finite-sample-level results: by setting the desired final accuracy ε in Theorem 5.1 to match that in Theorem 6.1, namely letting ε = O((d/n)^1/2) for sufficiently unbalanced and ε = O((d/n)^1/4) for sufficiently balanced fixed mixing weights, we intuitively derive iteration complexity bounds of O(log(1/ε)) = O(log(n/d)) and O(ε^-2) = O((n/d)^1/2) at the finite-sample level for sufficiently unbalanced and balanced initial mixing weights, respectively. We further extend our analysis in the overspecified setting to the low-SNR regime.
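To fix ideas, the EM iteration the abstract analyzes can be sketched as follows. This is a minimal illustrative implementation of sample-based EM for 2MLR, not the authors' code: it assumes Gaussian covariates and noise with known variance, and the function name, the `update_pi` flag, and all parameter names are hypothetical choices for this sketch. The paper's fixed-mixing-weight setting (Theorem 6.1) corresponds to `update_pi=False`.

```python
import numpy as np

def em_2mlr(X, y, beta1, beta2, pi1, n_iters=50, sigma=1.0, update_pi=True):
    """Illustrative EM for two-component mixed linear regression:
    y = x^T beta_k + N(0, sigma^2) noise, component k drawn w.p. pi_k.
    All names here are this sketch's own, not the paper's notation."""
    for _ in range(n_iters):
        # E-step: posterior responsibility of component 1 for each sample,
        # computed from the squared residuals under each component.
        r1 = y - X @ beta1
        r2 = y - X @ beta2
        log_w1 = np.log(pi1) - r1**2 / (2 * sigma**2)
        log_w2 = np.log(1 - pi1) - r2**2 / (2 * sigma**2)
        w = 1.0 / (1.0 + np.exp(log_w2 - log_w1))  # P(z = 1 | x, y)

        # M-step: weighted least squares for each component's parameters.
        beta1 = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        beta2 = np.linalg.solve(X.T @ ((1 - w)[:, None] * X), X.T @ ((1 - w) * y))

        # Optionally update the mixing weight; the paper's finite-sample
        # theorem keeps the mixing weights fixed instead.
        if update_pi:
            pi1 = w.mean()
    return beta1, beta2, pi1
```

A balanced initial guess corresponds to `pi1 = 0.5`, an unbalanced one to, e.g., `pi1 = 0.7`; the abstract's rates describe how the number of iterations needed for ε-accuracy differs between these two initializations.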