A Riemannian rank-adaptive method for low-rank matrix completion
Publication: 2070331
DOI: 10.1007/s10589-021-00328-w
zbMATH Open: 1484.90110
arXiv: 2103.14768
OpenAlex: W3213035342
Wikidata: Q115384040 (Scholia: Q115384040)
MaRDI QID: Q2070331
Publication date: 24 January 2022
Published in: Computational Optimization and Applications
Abstract: The low-rank matrix completion problem can be solved by Riemannian optimization on a fixed-rank manifold. However, a drawback of the known approaches is that the rank parameter has to be fixed a priori. In this paper, we consider the optimization problem on the set of bounded-rank matrices. We propose a Riemannian rank-adaptive method, which consists of fixed-rank optimization, a rank-increase step, and a rank-reduction step. We explore its performance on the low-rank matrix completion problem. Numerical experiments on synthetic and real-world datasets illustrate that the proposed rank-adaptive method compares favorably with state-of-the-art algorithms. In addition, they show that each aspect of this rank-adaptive framework can be incorporated separately into existing algorithms to improve performance.
Full work available at URL: https://arxiv.org/abs/2103.14768
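The abstract describes a three-part framework: fixed-rank optimization, a rank-increase step, and a rank-reduction step. The sketch below is only a rough illustration of such a loop, not the authors' algorithm: the fixed-rank stage is replaced by a plain projected-gradient step (gradient step followed by truncated-SVD retraction), and the rank-update triggers (a stagnation test with tolerance stall_tol and a relative singular-value threshold sv_tol), as well as the function name rank_adaptive_completion and all parameter names, are hypothetical choices made for this example.

```python
# Minimal sketch of a rank-adaptive loop for matrix completion (NumPy).
# NOT the paper's method: the fixed-rank stage is a simple projected-gradient
# step, and the rank-increase/rank-reduction heuristics are assumptions.
import numpy as np

def rank_adaptive_completion(M_obs, mask, k=1, k_max=20, step=1.0,
                             outer_iters=30, inner_iters=50,
                             sv_tol=1e-3, stall_tol=1e-4):
    """Complete the matrix observed on `mask`, adapting the rank k <= k_max."""
    X = np.zeros_like(M_obs)
    prev_res = np.inf
    for _ in range(outer_iters):
        # Fixed-rank stage: gradient step + truncation back to rank <= k.
        for _ in range(inner_iters):
            grad = np.where(mask, X - M_obs, 0.0)   # gradient of 0.5*||P_Omega(X - M)||^2
            U, s, Vt = np.linalg.svd(X - step * grad, full_matrices=False)
            X = (U[:, :k] * s[:k]) @ Vt[:k]         # retraction by truncated SVD
        res = np.linalg.norm(np.where(mask, X - M_obs, 0.0))

        # Rank-reduction step: drop singular values that are negligible
        # relative to the largest one.
        k_eff = int(np.sum(s[:k] > sv_tol * s[0])) if s[0] > 0 else 1
        if k_eff < k:
            k = max(k_eff, 1)
        # Rank-increase step: if progress has stalled but the rank budget
        # is not exhausted, allow one more rank.
        elif prev_res - res < stall_tol * max(prev_res, 1.0) and k < k_max:
            k += 1
        prev_res = res
    return X, k
```

A small usage example on a synthetic rank-5 matrix with half the entries observed (again, purely illustrative):

```python
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 40))   # rank-5 ground truth
mask = rng.random(A.shape) < 0.5
X, k = rank_adaptive_completion(np.where(mask, A, 0.0), mask, k=1, k_max=10)
print(k, np.linalg.norm(X - A) / np.linalg.norm(A))
```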
Recommendations
- Low-rank matrix completion by Riemannian optimization
- Riemannian gradient descent methods for graph-regularized matrix completion
- Robust low-rank matrix completion by Riemannian optimization
- Guarantees of Riemannian optimization for low rank matrix completion
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
Keywords: matrix completion; low-rank; Riemannian optimization; fixed-rank manifold; bounded-rank matrices; rank-adaptive
Cites Work
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Title not available
- Two-Point Step Size Gradient Methods
- Low-rank matrix completion by Riemannian optimization
- Low-Rank Optimization with Trace Norm Penalty
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- The Riemannian Barzilai–Borwein method with nonmonotone line search and the matrix geometric mean computation
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Low rank matrix completion by alternating steepest descent methods
- Guarantees of Riemannian optimization for low rank matrix recovery
- Global rates of convergence for nonconvex optimization on manifolds
- Riemannian Optimization on the Symplectic Stiefel Manifold
- A brief introduction to manifold optimization
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
Cited In (11)
- An adaptation for iterative structured matrix completion
- Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses
- A Riemannian Framework for Low-Rank Structured Elliptical Models
- A feasible method for general convex low-rank SDP problems
- Low-rank tensor methods for partial differential equations
- Stable Rank-Adaptive Dynamically Orthogonal Runge–Kutta Schemes
- Adaptive and Implicit Regularization for Matrix Completion
- T-product factorization based method for matrix and tensor completion problems
- Fully adaptive structure-preserving hyper-reduction of parametric Hamiltonian systems
- A universal rank approximation method for matrix completion
- Guarantees of Riemannian optimization for low rank matrix completion