Global Optimality in Low-Rank Matrix Optimization
From MaRDI portal
Publication:4622249
DOI: 10.1109/TSP.2018.2835403
zbMATH Open: 1414.90297
arXiv: 1702.07945
Wikidata: Q129829534 · Scholia: Q129829534
MaRDI QID: Q4622249 · FDO: Q4622249
Authors: Zhihui Zhu, Qiuwei Li, Gongguo Tang, Michael B. Wakin
Publication date: 12 February 2019
Published in: IEEE Transactions on Signal Processing
Abstract: This paper considers the minimization of a general objective function \(f(X)\) over the set of rectangular \(n \times m\) matrices that have rank at most \(r\). To reduce the computational burden, we factorize the variable \(X\) into a product of two smaller matrices and optimize over these two matrices instead of \(X\). Despite the resulting nonconvexity, recent studies in matrix completion and sensing have shown that the factored problem has no spurious local minima and obeys the so-called strict saddle property (the function has directional negative curvature at all critical points that are not local minima). We analyze the global geometry for a general and yet well-conditioned objective function whose restricted strong convexity and restricted strong smoothness constants are comparable. In particular, we show that the reformulated objective function has no spurious local minima and obeys the strict saddle property. These geometric properties imply that a number of iterative optimization algorithms (such as gradient descent) can provably solve the factored problem with global convergence.
Full work available at URL: https://arxiv.org/abs/1702.07945
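The factored approach the abstract describes can be illustrated with a minimal sketch, assuming the simplest perfectly conditioned objective \(f(X) = \tfrac12\|X - M\|_F^2\) for a known rank-\(r\) matrix \(M\); the dimensions, step size, and initialization scale below are illustrative choices, not values from the paper:

```python
import numpy as np

# Illustrative setup (not from the paper): minimize f(X) = 0.5 * ||X - M||_F^2
# over rank-r matrices via the factorization X = U V^T, then run plain
# gradient descent on the factors (U, V).
rng = np.random.default_rng(0)
n, m, r = 30, 20, 3

# Ground-truth rank-r matrix M
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))

# Small random initialization of the factors
U = 0.1 * rng.standard_normal((n, r))
V = 0.1 * rng.standard_normal((m, r))

step = 0.01
for _ in range(3000):
    R = U @ V.T - M          # residual; also the gradient of f at X = U V^T
    U -= step * (R @ V)      # dF/dU = (U V^T - M) V
    V -= step * (R.T @ U)    # dF/dV = (U V^T - M)^T U

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

Although the factored objective is nonconvex, this run converges to the global minimum from a generic small initialization, consistent with the no-spurious-local-minima and strict saddle properties established in the paper for well-conditioned objectives.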
Cited In (20)
- The global optimization geometry of shallow linear neural networks
- Title not available
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Global optimization for sparse solution of least squares problems
- Rank-constrained fundamental matrix estimation by polynomial global optimization versus the eight-point algorithm
- Role of sparsity and structure in the optimization landscape of non-convex matrix sensing
- Certifying the Absence of Spurious Local Minima at Infinity
- Column \(\ell_{2,0}\)-norm regularized factorization model of low-rank matrix recovery and its computation
- A new complexity metric for nonconvex rank-one generalized matrix completion
- Spectrally constrained optimization
- Matrix completion via minimizing an approximate rank
- GNMR: a provable one-line algorithm for low rank matrix recovery
- Provable accelerated gradient method for nonconvex low rank optimization
- Exact guarantees on the absence of spurious local minima for non-negative rank-1 robust principal component analysis
- Low rank matrix recovery with adversarial sparse noise
- An equivalence between critical points for rank constraints versus low-rank factorizations
- An Unbiased Approach to Low Rank Recovery
- Error bound of critical points and KL property of exponent 1/2 for squared F-norm regularized factorization
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Nonconvex Robust Low-Rank Matrix Recovery