Globally convergent Newton-type methods for multiobjective optimization
From MaRDI portal
Publication: 2082544
DOI: 10.1007/s10589-022-00414-7
zbMath: 1502.90165
OpenAlex: W4296182520
MaRDI QID: Q2082544
F. S. Lima, L. F. Prudente, Max L. N. Gonçalves
Publication date: 4 October 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00414-7
Numerical mathematical programming methods (65K05)
Nonconvex programming, global optimization (90C26)
Multi-objective and goal programming (90C29)
Newton-type methods (49M15)
Related Items (8)
- An accelerated proximal gradient method for multiobjective optimization
- Spectral conjugate gradient methods for vector optimization problems
- Improved front steepest descent for multi-objective optimization
- Adaptive sampling stochastic multigradient algorithm for stochastic multiobjective optimization
- Multiobjective BFGS method for optimization on Riemannian manifolds
- A memetic procedure for global multi-objective optimization
- Memory gradient method for multiobjective optimization
- A limited memory quasi-Newton approach for multi-objective optimization
Uses Software
Cites Work
- Convergence of stochastic search algorithms to finite size Pareto set approximations
- Steepest descent methods for multicriteria optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- Nonmonotone gradient methods for vector optimization with a portfolio optimization application
- A steepest descent method for vector optimization
- A projected gradient method for vector optimization problems
- A perfect example for the BFGS method
- Conditional gradient method for multiobjective optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Quasi-Newton's method for multiobjective optimization
- Nonmonotone line searches for unconstrained multiobjective optimization problems
- Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems
- Inexact projected gradient method for vector optimization
- On the divergence of line search methods
- The multiobjective steepest descent direction is not Lipschitz continuous, but is Hölder continuous
- Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization
- Proper efficiency and the theory of vector maximization
- Singular Continuation: Generating Piecewise Linear Approximations to Pareto Sets via Global Analysis
- Direct Multisearch for Multiobjective Optimization
- A modified Quasi-Newton method for vector optimization problem
- Newton's Method for Multiobjective Optimization
- LAPACK Users' Guide
- Testing Unconstrained Optimization Software
- Normal-Boundary Intersection: A New Method for Generating the Pareto Surface in Nonlinear Multicriteria Optimization Problems
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- Convergence Properties of the BFGS Algorithm
- A Wolfe Line Search Algorithm for Vector Optimization
- A quadratically convergent Newton method for vector optimization
- Extended Newton Methods for Multiobjective Optimization: Majorizing Function Technique and Convergence Analysis
- Proximal Methods in Vector Optimization
- Practical Augmented Lagrangian Methods for Constrained Optimization
- Benchmarking optimization software with performance profiles
- Generalized homotopy approach to multiobjective optimization