Two Tales of L1 Variants for Sparse Signal and Low-rank Tensor Recovery
November 8,
Speaker(s): Yifei Liu, Associate Professor, School of Data Science and Society, The University of North Carolina
Regularization plays a pivotal role in tackling challenging ill-posed problems by guiding solutions towards desired properties. This talk covers three key problems---sparse signal recovery, low-rank matrix completion, and robust principal component analysis (RPCA) for tensors---all of which are linked to $L_0$ regularization. As the $L_0$ model is challenging to work with directly, I will introduce two variants of the popular $L_1$ norm for approximating the $L_0$ regularization: one known as the transformed $L_1$ (TL1), and the other the ratio of the $L_1$ and $L_2$ norms, denoted by $L_1/L_2$. Our theoretical analysis establishes the local optimality of the $L_1/L_2$ models under conditions analogous to those for the convex $L_1$ and nuclear norm methods. Additionally, we provide a statistical analysis of the estimator derived from TL1-regularized matrix completion under a general sampling distribution, rather than the classic uniform sampling. Our results show that the model achieves a convergence rate comparable to that of the nuclear norm-based model, despite the challenges posed by the non-convexity of TL1. I will also present a range of real-world applications, including limited-angle CT reconstruction and video background modeling, demonstrating the superior performance of these two $L_1$ variants compared to state-of-the-art methods.
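For concreteness, the two penalties are sketched below in their standard forms from the literature; the shape parameter $a > 0$ for TL1 and the vector formulations are conventions from prior work, and the exact models used in the talk may differ. For a vector $x \in \mathbb{R}^n$,

$$\mathrm{TL1}_a(x) \;=\; \sum_{i=1}^{n} \frac{(a+1)\,|x_i|}{a+|x_i|}, \qquad \frac{\|x\|_1}{\|x\|_2} \;=\; \frac{\sum_{i=1}^{n} |x_i|}{\sqrt{\sum_{i=1}^{n} x_i^2}} \quad (x \neq 0).$$

TL1 interpolates between the $L_0$ penalty (as $a \to 0^+$) and the $L_1$ norm (as $a \to \infty$), while the $L_1/L_2$ ratio is scale invariant. For the low-rank matrix and tensor problems, such penalties are typically applied to the singular values in place of the nuclear norm.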