scientific article; zbMATH DE number 7626764
From MaRDI portal
Publication:5053271
Authors:
Publication date: 6 December 2022
Full work available at URL: https://jmlr.csail.mit.edu/papers/v22/20-033.html
Title of this publication is not available
Recommendations
- Riemannian conjugate gradient methods for computing the extreme eigenvalues of symmetric tensors
- Riemannian Newton method for the multivariate eigenvalue problem
- A Riemannian optimization approach to the matrix singular value decomposition
- A Riemannian Newton algorithm for nonlinear eigenvalue problems
- On computing the eigenvectors of a class of structured matrices
- scientific article; zbMATH DE number 1748566
- The geometry of matrix eigenvalue methods
- scientific article; zbMATH DE number 782060
- Computing eigenspaces with low rank constraints
- Riemannian multigrid line search for low-rank problems
Keywords: optimal convergence rate; generalized eigenvalue problem; Riemannian optimization; eigenvector computation; shift-and-invert preconditioning
Cites Work
- A feasible method for optimization with orthogonality constraints
- Matrix completion and low-rank SVD via fast alternating least squares
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Title not available
- Relations between two sets of variates
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Two-Point Step Size Gradient Methods
- Numerical methods for large eigenvalue problems
- Title not available
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Optimization and dynamical systems
- Title not available
- The Matrix Eigenvalue Problem
- Title not available
- Convex optimization: algorithms and complexity
- Stochastic Gradient Descent on Riemannian Manifolds
- Self-consistent-field calculations using Chebyshev-filtered subspace iteration
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- Title not available
- The Riemannian Barzilai–Borwein method with nonmonotone line search and the matrix geometric mean computation
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Title not available