
Low-tubal-rank Tensor Completion using Alternating Minimization

2016-10-05

Xiao-Yang Liu, Shuchin Aeron, Vaneet Aggarwal, Xiaodong Wang

Abstract

The low-tubal-rank tensor model has been recently proposed for real-world multidimensional data. In this paper, we study the low-tubal-rank tensor completion problem, i.e., to recover a third-order tensor by observing a subset of its elements selected uniformly at random. We propose a fast iterative algorithm, called Tubal-Alt-Min, that is inspired by a similar approach for low-rank matrix completion. The unknown low-tubal-rank tensor is represented as the product of two much smaller tensors with the low-tubal-rank property being automatically incorporated, and Tubal-Alt-Min alternates between estimating those two tensors using tensor least squares minimization. First, we note that tensor least squares minimization is different from its matrix counterpart and nontrivial, as the circular convolution operator of the low-tubal-rank tensor model is intertwined with the sub-sampling operator. Second, the theoretical performance guarantee is challenging since Tubal-Alt-Min is iterative and nonconvex in nature. We prove that 1) Tubal-Alt-Min guarantees exponential convergence to the global optimum, and 2) for an n × n × k tensor with tubal-rank r ≪ n, the required sampling complexity is O(nr^2 k log^3 n) and the computational complexity is O(n^2 rk^2 log^2 n). Third, on both synthetic data and real-world video data, evaluation results show that compared with tensor-nuclear-norm minimization (TNN-ADMM), Tubal-Alt-Min improves the recovery error dramatically (by orders of magnitude). It is estimated that Tubal-Alt-Min converges at an exponential rate 10^(-0.4423·Iter), where Iter denotes the number of iterations, which is much faster than TNN-ADMM's rate of 10^(-0.0332·Iter), and the running time can be accelerated by more than 5 times for a 200 × 200 × 20 tensor.
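Since the abstract compresses the algorithm into a few sentences, a minimal sketch may help make the model concrete. The code below is illustrative, not the paper's implementation: it assumes numpy, the names t_product, t_transpose, and tubal_alt_min_sketch are hypothetical, the factors are initialized randomly rather than by a spectral initialization, and each tensor least-squares subproblem (which the paper solves exactly) is approximated here by a few gradient steps.

```python
import numpy as np

def t_product(A, B):
    """t-product A * B of A (n1 x r x k) and B (r x n2 x k): circular
    convolution of tube fibers, computed as slice-wise matrix products
    in the Fourier domain along the third mode."""
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    Cf = np.einsum('irk,rjk->ijk', Af, Bf)
    return np.fft.ifft(Cf, axis=2).real

def t_transpose(A):
    """Tensor transpose: transpose each frontal slice and reverse the
    order of slices 2..k; this is the adjoint of the t-product."""
    At = A.transpose(1, 0, 2)
    return np.concatenate([At[:, :, :1], At[:, :, -1:0:-1]], axis=2)

def tubal_alt_min_sketch(T_obs, mask, r, outer=30, inner=25, seed=0):
    """Recover a low-tubal-rank tensor from T_obs (zeros where unobserved)
    and a 0/1 observation mask, as X * Y with X (n1 x r x k), Y (r x n2 x k).
    ASSUMPTION: the paper solves each tensor least-squares subproblem
    exactly; this sketch substitutes `inner` gradient steps instead."""
    n1, n2, k = T_obs.shape
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n1, r, k)) / np.sqrt(n1 * k)
    Y = rng.standard_normal((r, n2, k)) / np.sqrt(n2 * k)
    for _ in range(outer):
        # Update Y with X fixed; step size 1/L, where L bounds the
        # squared spectral norm of the fixed factor's Fourier slices.
        L = np.linalg.norm(np.fft.fft(X, axis=2), ord=2, axis=(0, 1)).max() ** 2
        for _ in range(inner):
            R = mask * (t_product(X, Y) - T_obs)  # residual on observed entries
            Y = Y - t_product(t_transpose(X), R) / L
        # Update X with Y fixed (symmetric step).
        L = np.linalg.norm(np.fft.fft(Y, axis=2), ord=2, axis=(0, 1)).max() ** 2
        for _ in range(inner):
            R = mask * (t_product(X, Y) - T_obs)
            X = X - t_product(R, t_transpose(Y)) / L
    return t_product(X, Y)

if __name__ == "__main__":
    # Synthetic check: an n x n x k tensor of tubal-rank r, ~50% observed.
    n, k, r = 40, 10, 3
    rng = np.random.default_rng(1)
    T = t_product(rng.standard_normal((n, r, k)), rng.standard_normal((r, n, k)))
    mask = (rng.random((n, n, k)) < 0.5).astype(float)
    T_hat = tubal_alt_min_sketch(T * mask, mask, r)
    print("relative error:", np.linalg.norm(T_hat - T) / np.linalg.norm(T))
```

The FFT-based t_product is what keeps the per-iteration cost low, in line with the complexity stated in the abstract; the gradient-step inner solver is the main simplification here, as the paper's convergence guarantee relies on solving each tensor least-squares subproblem exactly.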
