Low-rank matrix completion by Riemannian optimization
From MaRDI portal
Publication:2848192
Abstract: The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a new algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical non-linear conjugate gradients, developed within the framework of retraction-based optimization on manifolds. We describe all the objects from differential geometry necessary to perform optimization over this low-rank matrix manifold, seen as a submanifold embedded in the space of matrices. In particular, we describe how metric projection can be used as a retraction and how vector transport lets us obtain the conjugate search directions. Finally, we prove convergence of a regularized version of our algorithm under the assumption that the restricted isometry property holds for incoherent matrices throughout the iterations. The numerical experiments indicate that our approach scales very well to large-scale problems and compares favorably with the state of the art, outperforming most existing solvers.
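The abstract's core ingredients (the least-squares objective on the sampling set, the tangent-space projection of the Euclidean gradient, and metric projection via truncated SVD as the retraction) can be sketched in a few lines of NumPy. This is a minimal illustrative sketch of one plain Riemannian gradient step, not the paper's conjugate-gradient implementation; all function and variable names are illustrative.

```python
# Minimal sketch of one Riemannian gradient step for rank-k matrix
# completion on the fixed-rank manifold (illustrative, not the authors'
# nonlinear-CG implementation).
import numpy as np

def riemannian_gradient_step(U, S, Vt, A, mask, step=0.5):
    """X = U @ S @ Vt is the current rank-k iterate; `mask` (0/1) marks
    the sampled entries of A.

    The Euclidean gradient of f(X) = 0.5 * ||mask * (X - A)||_F^2 is
    G = mask * (X - A); it is projected onto the tangent space at X and
    the update is retracted to the manifold by a truncated SVD (metric
    projection)."""
    k = S.shape[0]
    X = U @ S @ Vt
    G = mask * (X - A)                       # Euclidean gradient on the sampling set
    # Tangent-space projection: P_T(G) = U U^T G + G V V^T - U U^T G V V^T
    UtG = U.T @ G
    GV = G @ Vt.T
    tangent = U @ UtG + GV @ Vt - U @ (UtG @ Vt.T) @ Vt
    # Retraction: truncate the updated matrix back to rank k via SVD
    Uy, sy, Vty = np.linalg.svd(X - step * tangent, full_matrices=False)
    return Uy[:, :k], np.diag(sy[:k]), Vty[:k, :]
```

To try it, one can generate a random rank-2 matrix, observe roughly half its entries, initialize from the truncated SVD of the masked data, and iterate; the residual on the sampled entries decreases. The dense SVD here is only for clarity; the paper exploits the low-rank structure so that the retraction costs far less than a full SVD.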
Recommendations
- Guarantees of Riemannian optimization for low rank matrix completion
- A Riemannian rank-adaptive method for low-rank matrix completion
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Robust low-rank matrix completion by Riemannian optimization
- Riemannian conjugate gradient method for low-rank tensor completion
Cited in
- Riemannian optimization via Frank-Wolfe methods
- Desingularization of bounded-rank matrix sets
- Guarantees of Riemannian optimization for low rank matrix completion
- A geometric approach to dynamical model order reduction
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- A Riemannian subspace BFGS trust region method
- Jacobi-Davidson method on low-rank matrix manifolds
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- Low-rank tensor completion by Riemannian optimization
- Reconstruction of jointly sparse vectors via manifold optimization
- Low-rank optimization with trace norm penalty
- Intrinsic representation of tangent vectors and vector transports on matrix manifolds
- Nonlinear matrix recovery using optimization on the Grassmann manifold
- Hierarchical compressed sensing
- Sufficient descent Riemannian conjugate gradient methods
- Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
- \(\varepsilon\)-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds
- Toeplitz matrix completion via smoothing augmented Lagrange multiplier algorithm
- Guarantees of Riemannian optimization for low rank matrix recovery
- Some empirical advances in matrix completion
- An image inpainting algorithm using exemplar matching and low-rank sparse prior
- A Riemannian Framework for Low-Rank Structured Elliptical Models
- A semi-smoothing augmented Lagrange multiplier algorithm for low-rank Toeplitz matrix completion
- Toeplitz matrix completion via a low-rank approximation algorithm
- Computing eigenspaces with low rank constraints
- The two-stage iteration algorithms based on the shortest distance for low-rank matrix completion
- Riemannian optimization for high-dimensional tensor completion
- Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
- A quadratically convergent algorithm for structured low-rank approximation
- Sensitivity of low-rank matrix recovery
- A penalty method for rank minimization problems in symmetric matrices
- Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds
- The Extrinsic Geometry of Dynamical Systems Tracking Nonlinear Matrix Projections
- From low-rank retractions to dynamical low-rank approximation and back
- An improved Riemannian conjugate gradient method and its application to robust matrix completion
- Dynamically Orthogonal Runge–Kutta Schemes with Perturbative Retractions for the Dynamical Low-Rank Approximation
- Efficient Weingarten map and curvature estimation on manifolds
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Accurate and fast matrix factorization for low-rank learning
- Proximal linearization methods for Schatten \(p\)-quasi-norm minimization
- Fast gradient method for low-rank matrix estimation
- Accelerated Alternating Projections for Robust Principal Component Analysis
- One-bit tensor completion via transformed tensor singular value decomposition
- Stochastic variance reduced gradient for affine rank minimization problem
- Riemannian preconditioning
- Alternating least squares as moving subspace correction
- Riemannian optimization for phase retrieval from masked Fourier measurements
- Rank $2r$ Iterative Least Squares: Efficient Recovery of Ill-Conditioned Low Rank Matrices from Few Entries
- A Riemannian BFGS method without differentiated retraction for nonconvex optimization problems
- A gradient sampling method on algebraic varieties and application to nonsmooth low-rank optimization
- Riemannian conjugate gradient methods with inverse retraction
- Implicit low-rank Riemannian schemes for the time integration of stiff partial differential equations
- Reduction of nonlinear embedded boundary models for problems with evolving interfaces
- Continuation methods for Riemannian optimization
- Modified memoryless spectral-scaling Broyden family on Riemannian manifolds
- Online learning in the embedded manifold of low-rank matrices
- Multi-dimensional scaling from \(K\)-nearest neighbourhood distances
- A brief introduction to manifold optimization
- A nonmonotone trust region method for unconstrained optimization problems on Riemannian manifolds
- Riemannian gradient descent methods for graph-regularized matrix completion
- Fast Cadzow's algorithm and a gradient variant
- A limited-memory Riemannian symmetric rank-one trust-region method with a restart strategy
- An efficient method for solving a class of matrix trace function minimization problem in multivariate statistical
- GNMR: a provable one-line algorithm for low rank matrix recovery
- Differentiable piecewise-Bézier surfaces on Riemannian manifolds
- Solving systems of phaseless equations via Riemannian optimization with optimal sampling complexity
- Matrix completion for matrices with low-rank displacement
- Optimization on flag manifolds
- Adaptive quadratically regularized Newton method for Riemannian optimization
- Nonnegative low rank matrix approximation for nonnegative matrices
- Fixed-rank matrix factorizations and Riemannian low-rank optimization
- A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds
- Riemannian conjugate gradient method for low-rank tensor completion
- A gradient system for low rank matrix completion
- A kind of ill-posed inverse problem solving with sparsity constraint
- Low-rank nonnegative matrix factorization on Stiefel manifold
- Survey on matrix completion models and algorithms
- Optimization on matrix manifold based on gradient information and its applications in network control
- Low rank tensor recovery via iterative hard thresholding
- Robust low-rank matrix completion by Riemannian optimization
- A nonlocal low-rank regularization method for fractal image coding
- Fenchel Duality and a Separation Theorem on Hadamard Manifolds
- Nonnegative Low Rank Matrix Completion by Riemannian Optimization Methods
- An efficient algorithm for solving a class of matrix optimization problem in scalable probabilistic approximation
- An efficient method for non-negative low-rank completion
- A Broyden class of quasi-Newton methods for Riemannian optimization
- Memoryless quasi-Newton methods based on the spectral-scaling Broyden family for Riemannian optimization
- An ADMM-factorization algorithm for low rank matrix completion
- Low-rank matrix completion in a general non-orthogonal basis
- Harmonic mean iteratively reweighted least squares for low-rank matrix recovery
- An equivalence between critical points for rank constraints versus low-rank factorizations
- Robust PCA by manifold optimization
- Riemannian trust region methods for \(\mathrm{SC}^1\) minimization
- Robust recovery of Robinson property in \(L^p\)-graphons: a cut-norm approach
- Riemannian thresholding methods for row-sparse and low-rank matrix recovery
- Low rank matrix completion by alternating steepest descent methods
- An efficient damped Newton-type algorithm with globalization strategy on Riemannian manifolds
- Preserving Lagrangian structure in nonlinear model reduction with application to structural dynamics