Iteration-complexity and asymptotic analysis of steepest descent method for multiobjective optimization on Riemannian manifolds
From MaRDI portal
Publication:2302754
Abstract: This paper analyzes the steepest descent method for multiobjective optimization on Riemannian manifolds with lower bounded sectional curvature. The aim of the paper is twofold. First, an asymptotic analysis of the method is presented for three different finite procedures for determining the stepsize, namely the Lipschitz, adaptive and Armijo-type stepsizes. Second, under the assumption that the Jacobian of the objective function is componentwise Lipschitz continuous, iteration-complexity bounds are established for the method with these three stepsize strategies. In addition, some examples are presented to emphasize the importance of working in this new context. Numerical experiments illustrate the effectiveness of the method in this setting and confirm the theoretical results.
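The method described in the abstract can be illustrated with a minimal, hypothetical sketch (the setup below is my own and not taken from the paper): two smooth objectives on the unit sphere, the multiobjective steepest descent direction of Fliege and Svaiter (the min-norm point of the convex hull of the Riemannian gradients, in closed form for two objectives), a retraction by normalization in place of the exponential map, and an Armijo-type backtracking stepsize.

```python
import numpy as np

def proj_tangent(x, v):
    # Orthogonal projection of v onto the tangent space T_x S^{n-1}.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Retraction by normalization (a simple stand-in for the exponential map).
    y = x + v
    return y / np.linalg.norm(y)

def descent_step(x, p1, p2, beta=1e-4, shrink=0.5):
    # Illustrative objectives f_i(x) = -<p_i, x>; their Riemannian gradients
    # are the tangent projections of the Euclidean gradients -p_i.
    g1 = proj_tangent(x, -p1)
    g2 = proj_tangent(x, -p2)
    # Multiobjective steepest descent direction: negated min-norm element of
    # conv{g1, g2}; for m = 2 objectives the minimizer has a closed form.
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.5 if denom < 1e-16 else np.clip((g2 @ (g2 - g1)) / denom, 0.0, 1.0)
    d = -(lam * g1 + (1.0 - lam) * g2)
    # Armijo-type backtracking: both objectives must satisfy the sufficient
    # decrease condition simultaneously (the slopes g_i @ d are <= -||d||^2).
    f = lambda y: np.array([-p1 @ y, -p2 @ y])
    slopes = np.array([g1 @ d, g2 @ d])
    t = 1.0
    while np.any(f(retract(x, t * d)) > f(x) + beta * t * slopes):
        t *= shrink
    return retract(x, t * d), np.linalg.norm(d)

# Run until approximate Pareto criticality (||d|| near zero).
p1 = np.array([1.0, 0.0, 0.0])
p2 = np.array([0.0, 1.0, 0.0])
x = np.array([0.0, 0.0, 1.0])
for _ in range(100):
    x, crit = descent_step(x, p1, p2)
    if crit < 1e-8:
        break
```

By symmetry, the iterates stay on the sphere and approach the Pareto critical arc between p1 and p2 (here, the point (1, 1, 0)/sqrt(2)), with the criticality measure ||d|| driven to zero. The Lipschitz and adaptive stepsizes analyzed in the paper would replace the backtracking loop with an explicit stepsize formula.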
Recommendations
- Convergence of inexact steepest descent algorithm for multiobjective optimizations on Riemannian manifolds without curvature constraints
- An inexact steepest descent method for multicriteria optimization on Riemannian manifolds
- Unconstrained steepest descent method for multicriteria optimization on Riemannian manifolds
- A subgradient method for multiobjective optimization on Riemannian manifolds
- Iteration-complexity of the subgradient method on Riemannian manifolds with lower bounded curvature
Cites work
- scientific article; zbMATH DE number 6378042
- scientific article; zbMATH DE number 1246686
- scientific article; zbMATH DE number 1282147
- scientific article; zbMATH DE number 681023
- scientific article; zbMATH DE number 909255
- scientific article; zbMATH DE number 5223994
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Wolfe Line Search Algorithm for Vector Optimization
- A framework for generalising the Newton method and other iterative methods from Euclidean space to manifolds
- A method for constrained multiobjective optimization based on SQP techniques
- A projected gradient method for vector optimization problems
- A steepest descent method for vector optimization
- A steepest descent-like method for variable order vector optimization problems
- A subgradient method for vector optimization problems
- A survey and comparison of contemporary algorithms for computing the matrix geometric mean
- An extragradient-type algorithm for variational inequality on Hadamard manifolds
- An inexact steepest descent method for multicriteria optimization on Riemannian manifolds
- Barzilai and Borwein's method for multiobjective optimization problems
- Complexity of gradient descent for multiobjective optimization
- Conic geometric optimization on the manifold of positive definite matrices
- Expokit
- Full convergence of the steepest descent method with inexact line searches
- Gradient method for optimization on Riemannian manifolds with lower bounded curvature
- Inexact projected gradient method for vector optimization
- Iteration-complexity of gradient, subgradient and proximal point methods on Riemannian manifolds
- Linear convergence of subgradient algorithm for convex feasibility on Riemannian manifolds
- Multiple subgradient descent bundle method for convex nonsmooth multiobjective optimization
- Newton's method for multiobjective optimization
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- On the Riemannian geometry defined by self-concordant barriers and interior-point methods.
- On the convergence of the projected gradient method for vector optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Optimization Techniques on Riemannian Manifolds
- Practical augmented Lagrangian methods for constrained optimization
- Proximal point algorithms on Hadamard manifolds: linear convergence and finite termination
- Riemannian geometry
- Smooth nonlinear optimization of \(\mathbb R^n\)
- Steepest descent methods for multicriteria optimization.
- Subgradient projection algorithms for convex feasibility on Riemannian manifolds with lower bounded curvatures
- The Geometry of Algorithms with Orthogonality Constraints
- The Gradient Projection Method Along Geodesics
- The proximal point method for locally Lipschitz functions in multiobjective optimization with application to the compromise problem
- Trust region globalization strategy for the nonconvex unconstrained multiobjective optimization problem
- Unconstrained steepest descent method for multicriteria optimization on Riemannian manifolds
- Variational inequalities for set-valued vector fields on Riemannian manifolds: convexity of the solution set and the proximal point algorithm
- Weak sharp minima on Riemannian manifolds
Cited in (11)
- Inexact proximal point algorithm for quasiconvex optimization problems on Hadamard manifolds
- An incremental descent method for multi-objective optimization
- On the convergence of steepest descent methods for multiobjective optimization
- Convergence of inexact steepest descent algorithm for multiobjective optimizations on Riemannian manifolds without curvature constraints
- On \(q\)-steepest descent method for unconstrained multiobjective optimization problems
- An inexact steepest descent method for multicriteria optimization on Riemannian manifolds
- Complexity bound of trust-region methods for convex smooth unconstrained multiobjective optimization
- Riemannian conjugate gradient methods: general framework and specific algorithms with convergence analyses
- A trust region method for solving multicriteria optimization problems on Riemannian manifolds
- A generalized geometric spectral conjugate gradient algorithm for finding zero of a monotone tangent vector field on a constant curvature Hadamard manifold
- Multiobjective BFGS method for optimization on Riemannian manifolds