Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
DOI: 10.1137/140957822 · zbMath: 1355.65079 · arXiv: 1402.5284 · OpenAlex: W1993468393 · MaRDI QID: Q2954392
Reinhold Schneider, André Uschmajew
Publication date: 13 January 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1402.5284
Keywords: convergence analysis; steepest descent; tangent cones; Riemannian optimization; line search methods; low-rank matrices; Łojasiewicz gradient inequality
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Programming in abstract spaces (90C48)
Related Items (54)
Uses Software
Cites Work
- Tensor Decompositions and Applications
- Tensor-Train Decomposition
- The geometry of algorithms using hierarchical tensors
- Low-rank tensor completion by Riemannian optimization
- Greedy algorithms for high-dimensional eigenvalue problems
- The Łojasiewicz gradient inequality in the infinite-dimensional Hilbert space framework
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Dynamical low-rank approximation: Applications and numerical experiments
- Numerical computation of an analytic singular value decomposition of a matrix valued function
- On gradients of functions definable in o-minimal structures
- Critical points of matrix least squares distance functions
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Iterative methods for low rank approximation of graph similarity matrices
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Prox-regularity of rank constraint sets and implications for algorithms
- Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
- Fixed-rank matrix factorizations and Riemannian low-rank optimization
- A new scheme for the tensor representation
- Convergence to equilibrium for discretizations of gradient-like flows on Riemannian manifolds
- A projector-splitting integrator for dynamical low-rank approximation
- Convergence of gradient-based algorithms for the Hartree-Fock equations
- Low-Rank Matrix Completion by Riemannian Optimization
- Dynamical Approximation by Hierarchical Tucker and Tensor-Train Tensors
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Projection-like Retractions on Matrix Manifolds
- On Convergence of the Maximum Block Improvement Method
- Dynamical Tensor Approximation
- A Riemannian Optimization Approach for Computing Low-Rank Solutions of Lyapunov Equations
- Low-Rank Optimization on the Cone of Positive Semidefinite Matrices
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- Variational Analysis
- Limits of tangent spaces to real surfaces
- Low-Rank Optimization with Trace Norm Penalty
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Pointwise convergence of gradient‐like systems
- Dynamical Low‐Rank Approximation
- Convergence of the Iterates of Descent Methods for Analytic Cost Functions
- Generalized Kuhn–Tucker Conditions for Mathematical Programming Problems in a Banach Space