Error Bound of Empirical $\ell_2$ Risk Minimization for Noisy Standard and Generalized Phase Retrieval Problems
Junren Chen, Michael K. Ng
Abstract
In this paper, we study the estimation performance of empirical $\ell_2$ risk minimization (ERM) in noisy (standard) phase retrieval (NPR) given by $y_k = |\boldsymbol{\alpha}_k^*\mathbf{x}_0|^2 + \eta_k$, or noisy generalized phase retrieval (NGPR) formulated as $y_k = \mathbf{x}_0^*\mathbf{A}_k\mathbf{x}_0 + \eta_k$, where $\mathbf{x}_0 \in \mathbb{K}^d$ is the desired signal, $n$ is the sample size, and $\boldsymbol{\eta} = (\eta_1,\ldots,\eta_n)^\top$ is the noise vector. We establish new error bounds under different noise patterns, and our proofs are valid for both $\mathbb{K} = \mathbb{R}$ and $\mathbb{K} = \mathbb{C}$. In NPR under an arbitrary noise vector $\boldsymbol{\eta}$, we derive a new error bound $O\big(\|\boldsymbol{\eta}\|\frac{\sqrt{d}}{n} + \frac{|\mathbf{1}^\top\boldsymbol{\eta}|}{n}\big)$, which is tighter than the currently known one $O\big(\frac{\|\boldsymbol{\eta}\|}{\sqrt{n}}\big)$ in many cases. In NGPR, we show $O\big(\|\boldsymbol{\eta}\|\frac{\sqrt{d}}{n}\big)$ for arbitrary $\boldsymbol{\eta}$. In both problems, the bounds for arbitrary noise immediately give rise to $O\big(\sqrt{d/n}\big)$ for sub-Gaussian or sub-exponential random noise, with some conventional but inessential assumptions (e.g., independence or the zero-mean condition) removed or weakened. In addition, we make a first attempt at ERM under heavy-tailed random noise assumed to have a bounded $l$-th moment. To achieve a trade-off between bias and variance, we truncate the responses and propose a corresponding robust ERM estimator, which is shown to possess the guarantee $O\big(\big[\sqrt{d/n}\big]^{1-1/l}\big)$ in both NPR and NGPR. All the error bounds straightforwardly extend to the more general problem of rank-$r$ matrix recovery, and these results deliver the conclusion that the full-rank frame $\{\mathbf{A}_k\}_{k=1}^n$ in NGPR is more robust to biased noise than the rank-1 frame $\{\boldsymbol{\alpha}_k\boldsymbol{\alpha}_k^*\}_{k=1}^n$ in NPR. Extensive experimental results are presented to illustrate our theoretical findings.
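As a concrete illustration (not part of the paper's analysis), the following NumPy sketch simulates the real-case NPR model $y_k = |\boldsymbol{\alpha}_k^\top\mathbf{x}_0|^2 + \eta_k$ and minimizes the empirical $\ell_2$ risk $\frac{1}{n}\sum_k (|\boldsymbol{\alpha}_k^\top\mathbf{x}|^2 - y_k)^2$ by plain gradient descent. The step size, iteration count, and the local initialization near $\mathbf{x}_0$ are illustrative assumptions; a globally convergent method would require a careful initialization (e.g., spectral), which this sketch skips. Recovery is measured up to the unavoidable global sign ambiguity.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 500

# Ground-truth signal, Gaussian sensing vectors (real case K = R), small noise
x0 = rng.standard_normal(d)
A = rng.standard_normal((n, d))        # row k is alpha_k^T
eta = 0.01 * rng.standard_normal(n)
y = (A @ x0) ** 2 + eta                # NPR model: y_k = |alpha_k^T x0|^2 + eta_k

def erm_loss(x):
    """Empirical l2 risk: (1/n) * sum_k (|alpha_k^T x|^2 - y_k)^2."""
    return np.mean(((A @ x) ** 2 - y) ** 2)

def erm_grad(x):
    """Gradient of the empirical l2 risk: (4/n) * A^T ((Ax)^2 - y) * (Ax)."""
    Ax = A @ x
    return (4.0 / n) * (A.T @ ((Ax ** 2 - y) * Ax))

# Plain gradient descent from a point near x0 (local refinement only)
x = x0 + 0.1 * rng.standard_normal(d)
err0 = min(np.linalg.norm(x - x0), np.linalg.norm(x + x0))
for _ in range(2000):
    x -= 1e-3 * erm_grad(x)

# Estimation error up to the global sign ambiguity x0 ~ -x0
err = min(np.linalg.norm(x - x0), np.linalg.norm(x + x0))
```

Since the noise here is small and sub-Gaussian, the final error `err` lands far below the initial perturbation `err0`, consistent with an error floor that shrinks as $d/n \to 0$.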