Low-rank matrix completion by Riemannian optimization
From MaRDI portal
Publication:2848192
Abstract: The matrix completion problem consists of finding or approximating a low-rank matrix based on a few sampled entries of this matrix. We propose a new algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical nonlinear conjugate gradients, developed within the framework of retraction-based optimization on manifolds. We describe all the objects from differential geometry needed to perform optimization over this low-rank matrix manifold, seen as a submanifold embedded in the space of matrices. In particular, we describe how metric projection can be used as a retraction and how vector transport lets us obtain the conjugate search directions. Finally, we prove convergence of a regularized version of our algorithm under the assumption that the restricted isometry property holds for incoherent matrices throughout the iterations. The numerical experiments indicate that our approach scales very well for large-scale problems and compares favorably with the state of the art, outperforming most existing solvers.
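The optimization scheme the abstract describes — project the Euclidean gradient onto the tangent space of the fixed-rank manifold, take a step, and retract back via metric projection (truncated SVD) — can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: it uses plain Riemannian steepest descent with a fixed step size instead of the paper's nonlinear conjugate gradients with vector transport, and the function and parameter names (`complete`, `lr`, `steps`) are chosen here for illustration.

```python
import numpy as np

def project_tangent(U, V, Z):
    """Project an ambient matrix Z onto the tangent space of the rank-r
    manifold at X = U S V^T (U, V with orthonormal columns):
    P(Z) = U U^T Z + Z V V^T - U U^T Z V V^T."""
    ZV = Z @ V
    return U @ (U.T @ Z) + (ZV - U @ (U.T @ ZV)) @ V.T

def retract(Y, r):
    """Metric-projection retraction: truncated rank-r SVD of Y."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :], U[:, :r], Vt[:r, :].T

def complete(M, mask, r, steps=1000, lr=1.0):
    """Minimize ||P_Omega(X - M)||_F^2 over rank-r matrices by
    Riemannian steepest descent (simplified stand-in for the paper's CG).
    `mask` is the 0/1 indicator of the sampling set Omega."""
    X, U, V = retract(mask * M, r)          # spectral initialization
    for _ in range(steps):
        G = mask * (X - M)                  # Euclidean gradient on Omega
        xi = project_tangent(U, V, G)       # Riemannian gradient
        X, U, V = retract(X - lr * xi, r)   # step, then retract
    return X
```

On a small well-sampled low-rank instance this recovers the matrix to high accuracy; the full method in the paper replaces the steepest-descent direction with a conjugate direction obtained by vector transport of the previous search direction.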
Recommendations
- Guarantees of Riemannian optimization for low rank matrix completion
- A Riemannian rank-adaptive method for low-rank matrix completion
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Robust low-rank matrix completion by Riemannian optimization
- Riemannian conjugate gradient method for low-rank tensor completion
Cited in
- Two Newton methods on the manifold of fixed-rank matrices endowed with Riemannian quotient geometries
- An entropy-regularized ADMM for binary quadratic programming
- A framework of regularized low-rank matrix models for regression and classification
- A preconditioned Riemannian gradient descent algorithm for low-rank matrix recovery
- Tensor completion in hierarchical tensor representations
- Stability analysis of hierarchical tensor methods for time-dependent PDEs
- A new method based on the manifold-alternative approximating for low-rank matrix completion
- Geometric Methods on Low-Rank Matrix and Tensor Manifolds
- A hybrid Riemannian conjugate gradient method for nonconvex optimization problems
- A Riemannian gradient sampling algorithm for nonsmooth optimization on manifolds
- An extended Frank-Wolfe method with "in-face" directions, and its application to low-rank matrix completion
- Constructing low-rank Tucker tensor approximations using generalized completion
- Proximal gradient method for nonsmooth optimization over the Stiefel manifold
- Normal Cones Intersection Rule and Optimality Analysis for Low-Rank Matrix Optimization with Affine Manifolds
- Riemannian gradient methods for stochastic composition problems
- Low-rank retractions: a survey and new results
- A trust region method for solving multicriteria optimization problems on Riemannian manifolds
- A Riemannian rank-adaptive method for low-rank matrix completion
- Hermite interpolation with retractions on manifolds
- A Riemannian gossip approach to subspace learning on Grassmann manifold
- The Condition Number of Riemannian Approximation Problems
- Low-rank matrix iteration using polynomial-filtered subspace extraction
- Euclidean Representation of Low-Rank Matrices and Its Geometric Properties
- An alternative to EM for Gaussian mixture models: batch and stochastic Riemannian optimization
- Blind deconvolution by a steepest descent algorithm on a quotient manifold
- Robust principal component pursuit via inexact alternating minimization on matrix manifolds
- Matrix completion with sparse measurement errors
- Majorized proximal alternating imputation for regularized rank constrained matrix completion
- Iterative rank-one matrix completion via singular value decomposition and nuclear norm regularization
- Exact low-rank matrix completion from sparsely corrupted entries via adaptive outlier pursuit
- Finding stationary points on bounded-rank matrices: a geometric hurdle and a smooth remedy
- Recent Advances in Stochastic Riemannian Optimization
- Adaptive trust-region method on Riemannian manifold
- Tensor networks and hierarchical tensors for the solution of high-dimensional partial differential equations
- Empirical Bayes matrix completion
- A Riemannian conjugate gradient approach for solving the generalized eigenvalue problem with minimal perturbation
- Riemannian multigrid line search for low-rank problems
- A global exact penalty for rank-constrained optimization problem and applications
- Nonconvex weak sharp minima on Riemannian manifolds
- Nonsmooth optimization over the Stiefel manifold and beyond: proximal gradient method and recent variants
- Practical gradient and conjugate gradient methods on flag manifolds
- A Riemannian subspace limited-memory SR1 trust region method
- Painless breakups – efficient demixing of low rank matrices
- Tracking and Regret Bounds for Online Zeroth-Order Euclidean and Riemannian Optimization
- The geometry of algorithms using hierarchical tensors
- Accelerated low rank matrix approximate algorithms for matrix completion
- Exact matrix completion based on low rank Hankel structure in the Fourier domain
- Sequential quadratic optimization for nonlinear optimization problems on Riemannian manifolds
- Fast and provable algorithms for spectrally sparse signal reconstruction via low-rank Hankel matrix completion
- Riemannian conjugate gradient descent method for fixed multi rank third-order tensor completion
- Low rank pure quaternion approximation for pure quaternion matrices
- Riemannian optimization via Frank-Wolfe methods
- Desingularization of bounded-rank matrix sets
- Guarantees of Riemannian optimization for low rank matrix completion
- A geometric approach to dynamical model order reduction
- Analysis of asymptotic escape of strict saddle sets in manifold optimization
- A Riemannian subspace BFGS trust region method
- Jacobi-Davidson method on low-rank matrix manifolds
- Hybrid Riemannian conjugate gradient methods with global convergence properties
- Low-rank tensor completion by Riemannian optimization
- Reconstruction of jointly sparse vectors via manifold optimization
- Low-rank optimization with trace norm penalty
- Intrinsic representation of tangent vectors and vector transports on matrix manifolds
- Nonlinear matrix recovery using optimization on the Grassmann manifold
- Hierarchical compressed sensing
- Sufficient descent Riemannian conjugate gradient methods
- Preconditioned low-rank Riemannian optimization for linear systems with tensor product structure
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Low-rank Riemannian eigensolver for high-dimensional Hamiltonians
- \(\varepsilon\)-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds
- Toeplitz matrix completion via smoothing augmented Lagrange multiplier algorithm
- Guarantees of Riemannian optimization for low rank matrix recovery
- Some empirical advances in matrix completion
- An image inpainting algorithm using exemplar matching and low-rank sparse prior
- A Riemannian Framework for Low-Rank Structured Elliptical Models
- A semi-smoothing augmented Lagrange multiplier algorithm for low-rank Toeplitz matrix completion
- Toeplitz matrix completion via a low-rank approximation algorithm
- Computing eigenspaces with low rank constraints
- The two-stage iteration algorithms based on the shortest distance for low-rank matrix completion
- Riemannian optimization for high-dimensional tensor completion
- Convergence results for projected line-search methods on varieties of low-rank matrices via Łojasiewicz inequality
- A quadratically convergent algorithm for structured low-rank approximation
- Sensitivity of low-rank matrix recovery
- A penalty method for rank minimization problems in symmetric matrices
- Automatic differentiation for Riemannian optimization on low-rank matrix and tensor-train manifolds
- The Extrinsic Geometry of Dynamical Systems Tracking Nonlinear Matrix Projections
- From low-rank retractions to dynamical low-rank approximation and back
- An improved Riemannian conjugate gradient method and its application to robust matrix completion
- Dynamically Orthogonal Runge–Kutta Schemes with Perturbative Retractions for the Dynamical Low-Rank Approximation
- Efficient Weingarten map and curvature estimation on manifolds
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Accurate and fast matrix factorization for low-rank learning
- Proximal linearization methods for Schatten \(p\)-quasi-norm minimization
- Fast gradient method for low-rank matrix estimation
- Accelerated Alternating Projections for Robust Principal Component Analysis
- One-bit tensor completion via transformed tensor singular value decomposition
- Stochastic variance reduced gradient for affine rank minimization problem
- Riemannian preconditioning
- Alternating least squares as moving subspace correction
- Riemannian optimization for phase retrieval from masked Fourier measurements