Globally convergent optimization algorithms on Riemannian manifolds: Uniform framework for unconstrained and constrained optimization
Publication: 933806
DOI: 10.1007/s10957-006-9081-0
zbMath: 1153.90017
OpenAlex: W2064954628
Wikidata: Q115382591
Scholia: Q115382591
MaRDI QID: Q933806
Publication date: 25 July 2008
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-006-9081-0
Related Items
- Riemannian optimization and multidisciplinary design optimization
- Tensor methods for the Boltzmann-BGK equation
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
- Equilibrium problems on Riemannian manifolds with applications
- Two-sided Grassmann manifold algorithm for optimal ℋ2 model reduction
- Convergence of inexact steepest descent algorithm for multiobjective optimizations on Riemannian manifolds without curvature constraints
- Tukey's Depth for Object Data
- Optimization over geodesics for exact principal geodesic analysis
- An unconstrained H2 model order reduction optimisation algorithm based on the Stiefel manifold for bilinear systems
- Convergence rate of descent method with new inexact line-search on Riemannian manifolds
- Convergence Analysis of Gradient Algorithms on Riemannian Manifolds without Curvature Constraints and Application to Riemannian Mass
- Legendre transform and applications to finite and infinite optimization
- Subgradient projection algorithms for convex feasibility on Riemannian manifolds with lower bounded curvatures
- A globally optimal tri-vector method to solve an ill-posed linear system
- Optimization Methods on Riemannian Manifolds via Extremum Seeking Algorithms
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
- Blind deconvolution by a Newton method on the non-unitary hypersphere
- Global convergence of Riemannian line search methods with a Zhang-Hager-type condition
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- An optimal tri-vector iterative algorithm for solving ill-posed linear inverse problems
Cites Work
- Minimizing a differentiable function over a differential manifold
- Constrained optimization along geodesics
- Global optimization of univariate Lipschitz functions. I: Survey and properties
- Global optimization of univariate Lipschitz functions. II: New algorithms and computational comparison
- A class of polynomial variable metric algorithms for linear optimization
- Nonlinear coordinate representations of smooth optimization problems
- Optimization and dynamical systems
- Curves on $S^{n-1}$ That Lead to Eigenvalues or Their Means of a Matrix
- The Gradient Projection Method Along Geodesics