Globally Convergent Accelerated Algorithms for Multilinear Sparse Logistic Regression with ℓ0-constraints
Weifeng Yang, Wenwen Min
Abstract
A tensor is a multidimensional array. Regression methods based on low-rank tensor decomposition exploit this structure to reduce the number of parameters, and multilinear logistic regression is a powerful tool for analyzing such multidimensional data. To improve its efficacy and interpretability, we present a Multilinear Sparse Logistic Regression model with ℓ0-constraints (ℓ0-MLSR). In contrast to the ℓ1-norm and ℓ2-norm, the ℓ0-norm constraint is better suited for feature selection; however, because it is nonconvex and nonsmooth, the resulting problem is difficult to solve and convergence guarantees are lacking. The multilinear operations in ℓ0-MLSR introduce additional nonconvexity. To address these challenges, we propose an Accelerated Proximal Alternating Linearized Minimization method with Adaptive Momentum (APALM+) for solving the ℓ0-MLSR model. We prove that APALM+ guarantees convergence of the ℓ0-MLSR objective function, show that it is globally convergent to a first-order critical point, and establish its convergence rate using the Kurdyka-Łojasiewicz property. Empirical results on synthetic and real-world datasets confirm that our algorithm outperforms state-of-the-art methods in both accuracy and speed.
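When the ℓ0-constraint is a sparsity bound of the form ||w||_0 ≤ s, the projection onto the constraint set has a closed form: keep the s largest-magnitude entries and zero out the rest (hard thresholding). The sketch below illustrates one extrapolated proximal-gradient step of this general shape; the function names, momentum rule, and step size are illustrative assumptions, not the paper's actual APALM+ update.

```python
import numpy as np

def hard_threshold(x, s):
    """Project x onto {v : ||v||_0 <= s} by keeping the s entries
    of largest magnitude (closed-form l0 projection)."""
    z = np.zeros_like(x)
    if s <= 0:
        return z
    idx = np.argsort(np.abs(x))[-s:]  # indices of the s largest |x_i|
    z[idx] = x[idx]
    return z

def accelerated_prox_step(x, x_prev, grad, step, s, beta):
    """One illustrative accelerated proximal-gradient step:
    momentum extrapolation, gradient descent, then l0 projection.
    (Hypothetical helper; APALM+ alternates such steps over the
    factors of the low-rank tensor decomposition.)"""
    y = x + beta * (x - x_prev)            # momentum extrapolation
    return hard_threshold(y - step * grad(y), s)
```

For instance, `hard_threshold(np.array([3.0, -1.0, 2.0]), 2)` keeps the entries 3.0 and 2.0 and zeros out -1.0, which is exactly why the ℓ0 constraint performs feature selection directly rather than merely shrinking coefficients as ℓ1/ℓ2 penalties do.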