An Efficient Gauss--Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations
DOI: 10.1137/140971464 · zbMath: 1321.65060 · OpenAlex: W1060999305 · MaRDI QID: Q5502244
Xin Liu, Zaiwen Wen, Yin Zhang
Publication date: 18 August 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/140971464
Keywords: algorithm; global convergence; singular value decomposition; matrix completion; principal component analysis; numerical experiments; Krylov subspace methods; eigenvalue decomposition; Gauss-Newton methods; \(Q\)-linear convergence rate; low-rank product matrix approximation
MSC: Image analysis in multivariate analysis (62H35); Numerical solutions to overdetermined systems, pseudoinverses (65F20); Iterative numerical methods for linear systems (65F10); Matrix completion problems (15A83)
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- JADAMILU: a software code for computing selected eigenvalues of large sparse symmetric matrices
- Simultaneous iteration for computing invariant subspaces of non-Hermitian matrices
- A Davidson program for finding a few selected extreme eigenpairs of a large, sparse, real, symmetric matrix
- Trace-penalty minimization for large-scale eigenspace computation
- Computational aspects of F. L. Bauer's simultaneous iteration method
- Simultaneous iteration method for symmetric matrices
- Exact matrix completion via convex optimization
- Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method
- Implicitly Restarted Arnoldi Methods and Subspace Iteration
- Limited Memory Block Krylov Subspace Optimization for Computing Dominant Singular Value Decompositions
- FEAST As A Subspace Iteration Eigensolver Accelerated By Approximate Spectral Projection
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- A Filtered Lanczos Procedure for Extreme and Interior Eigenvalue Problems
- The Modified Gauss-Newton Method for the Fitting of Non-Linear Regression Functions by Least Squares
- Chebyshev Acceleration Techniques for Solving Nonsymmetric Eigenvalue Problems
- A Chebyshev–Davidson Algorithm for Large Symmetric Eigenproblems
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Two-Point Step Size Gradient Methods
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- A Simultaneous Iteration Algorithm for Real Matrices
- ARPACK Users' Guide
- Block Algorithms with Augmented Rayleigh-Ritz Projections for Large-Scale Eigenpair Computation
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- A method for the solution of certain non-linear problems in least squares