Global Optimality in Low-Rank Matrix Optimization

From MaRDI portal
Publication:4622249

DOI: 10.1109/TSP.2018.2835403 · zbMATH Open: 1414.90297 · arXiv: 1702.07945 · Wikidata: Q129829534 · Scholia: Q129829534 · MaRDI QID: Q4622249 · FDO: Q4622249


Authors: Zhihui Zhu, Qiuwei Li, Gongguo Tang, Michael B. Wakin


Publication date: 12 February 2019

Published in: IEEE Transactions on Signal Processing

Abstract: This paper considers the minimization of a general objective function f(X) over the set of rectangular n × m matrices that have rank at most r. To reduce the computational burden, we factorize the variable X into a product of two smaller matrices and optimize over these two matrices instead of X. Despite the resulting nonconvexity, recent studies in matrix completion and sensing have shown that the factored problem has no spurious local minima and obeys the so-called strict saddle property (the function has a direction of negative curvature at every critical point that is not a local minimum). We analyze the global geometry for a general and yet well-conditioned objective function f(X) whose restricted strong convexity and restricted strong smoothness constants are comparable. In particular, we show that the reformulated objective function has no spurious local minima and obeys the strict saddle property. These geometric properties imply that a number of iterative optimization algorithms (such as gradient descent) can provably solve the factored problem with global convergence.
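The factorization strategy the abstract describes can be illustrated on a toy instance. The sketch below is not the paper's setup (the paper treats a general well-conditioned f and, for the rectangular case, analyzes a regularized factored objective); here f(X) = ½‖X − M‖²_F with full observation of M, and plain gradient descent is run on the factors (U, V) with X = UVᵀ. All sizes and step-size choices are illustrative assumptions.

```python
import numpy as np

# Toy sketch of the factored approach: minimize f(X) = 0.5*||X - M||_F^2
# over rank-r matrices by writing X = U @ V.T and running gradient
# descent on (U, V). Problem sizes and step size are arbitrary choices.
rng = np.random.default_rng(0)
n, m, r = 12, 10, 2

# Ground-truth rank-r matrix M.
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))

# Small random initialization of the factors.
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((m, r))

step = 0.01
for _ in range(3000):
    R = U @ V.T - M        # residual; also the gradient of f at X = U V^T
    GU = R @ V             # chain rule: d f / d U = (grad f at X) V
    GV = R.T @ U           # chain rule: d f / d V = (grad f at X)^T U
    U -= step * GU
    V -= step * GV

# Relative reconstruction error of the recovered low-rank matrix.
err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

Despite the nonconvexity in (U, V), gradient descent from a small random start recovers M here, consistent with the benign-landscape message of the paper.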


Full work available at URL: https://arxiv.org/abs/1702.07945







Cited In (20)





