Numerous applications in data mining and machine learning require recovering a matrix of minimal rank. Robust principal component analysis (RPCA) is a general framework for handling this kind of problem. The nuclear-norm-based convex surrogate of the rank function in RPCA is widely investigated. Under certain assumptions, it can recover the underlying true low-rank matrix with high probability. However, those assumptions may not hold in real-world applications. Since the nuclear norm approximates the rank by adding all singular values together, which is essentially an ℓ1-norm of the singular values, the resulting approximation error is not trivial and thus the resulting matrix estimator can be significantly biased. To seek a closer approximation and to alleviate the above-mentioned limitations of the nuclear norm, we propose a nonconvex rank approximation. This approximation to the matrix rank is tighter than the nuclear norm. To solve the associated nonconvex minimization problem, we develop an efficient augmented Lagrange multiplier based optimization algorithm. Experimental results demonstrate that our method outperforms current state-of-the-art algorithms in both accuracy and efficiency.
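The abstract contrasts the matrix rank (the count of nonzero singular values) with the nuclear norm (their sum, i.e. their ℓ1-norm). A minimal NumPy sketch, not taken from the paper, illustrating why summing singular values can overshoot the rank when some singular values are large:

```python
import numpy as np

# Construct a rank-2 matrix as the product of random 5x2 and 2x5 factors.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 5))

s = np.linalg.svd(A, compute_uv=False)  # singular values, descending
rank = np.linalg.matrix_rank(A)         # counts nonzero singular values (l0)
nuclear_norm = s.sum()                  # sums all singular values (l1)

print("rank:", rank)                    # 2
print("nuclear norm:", nuclear_norm)    # grows with the magnitude of s, not just its support
```

Because the nuclear norm scales with the magnitudes of the singular values rather than merely counting them, minimizing it penalizes large singular values, which is the source of the bias the abstract describes.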
Title of host publication: Proceedings - 15th IEEE International Conference on Data Mining, ICDM 2015
Editors: Charu Aggarwal, Zhi-Hua Zhou, Alexander Tuzhilin, Hui Xiong, Xindong Wu
Number of pages: 10
State: Published - Jan 5 2016
Event: 15th IEEE International Conference on Data Mining, ICDM 2015 - Atlantic City, United States
Duration: Nov 14 2015 → Nov 17 2015
Name: Proceedings - IEEE International Conference on Data Mining, ICDM
Conference: 15th IEEE International Conference on Data Mining, ICDM 2015
Period: 11/14/15 → 11/17/15
Bibliographical note: Publisher Copyright © 2015 IEEE.
ASJC Scopus subject areas
- Engineering (all)