Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems
DOI: 10.1007/s10589-024-00571-x · MaRDI QID: Q6568922
Authors: L. F. Prudente, D. R. Souza
Publication date: 8 July 2024
Published in: Computational Optimization and Applications
Recommendations
- A globally convergent BFGS method with nonmonotone line search for non-convex minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Global convergence properties of the modified BFGS method associating with general line search model
- The global convergence of a modified BFGS method under inexact line search for nonconvex functions
- Global convergence of a modified limited memory BFGS method for non-convex minimization
Keywords: rate of convergence; global convergence; multiobjective optimization; Pareto optimality; quasi-Newton methods; BFGS; Wolfe line search
MSC classifications: Numerical mathematical programming methods (65K05); Multi-objective and goal programming (90C29); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53); Newton-type methods (49M15)
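For background on the keywords above: the paper's subject is a BFGS-type method with a Wolfe line search, extended to multiobjective problems. The following is a minimal single-objective sketch of the classical ingredients (inverse-Hessian BFGS update plus a weak-Wolfe bracketing line search) and is illustrative only; it is not the multiobjective algorithm of Prudente and Souza, and all function and parameter names here are the writer's own choices.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bracketing/bisection search for a step t satisfying the weak Wolfe conditions.
    (Illustrative sketch; not the line search used in the cited paper.)"""
    lo, hi, t = 0.0, np.inf, 1.0
    f0, g0 = f(x), grad(x) @ d          # g0 < 0 for a descent direction d
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:     # Armijo condition fails: shrink
            hi = t
        elif grad(x + t * d) @ d < c2 * g0:     # curvature condition fails: grow
            lo = t
        else:
            return t
        t = (lo + hi) / 2 if np.isfinite(hi) else 2 * lo
    return t

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Standard BFGS iteration maintaining an inverse-Hessian approximation H."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                               # quasi-Newton direction
        t = wolfe_line_search(f, grad, x, d)
        s = t * d                                # step
        y = grad(x + s) - g                      # gradient change
        if s @ y > 1e-12:                        # Wolfe steps give s'y > 0, keeping H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x = x + s
    return x
```

In the multiobjective setting studied in the paper, the scalar direction d = -H g is replaced by the solution of a subproblem involving all objective gradients, and the Wolfe conditions are imposed componentwise; the sketch above only conveys the scalar building blocks.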
Cites Work
- Practical augmented Lagrangian methods for constrained optimization
- Title not available
- Benchmarking optimization software with performance profiles
- A Family of Variable-Metric Methods Derived by Variational Means
- A new approach to variable metric algorithms
- Conditioning of Quasi-Newton Methods for Function Minimization
- Title not available
- Nonlinear multiobjective optimization
- Approximate proximal methods in vector optimization
- Adaptive Scalarization Methods in Multiobjective Optimization
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- A modified BFGS method and its global convergence in nonconvex minimization
- The BFGS method with exact line searches fails for non-convex objective functions
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- Convergence Properties of the BFGS Algoritm
- Proximal Methods in Vector Optimization
- Direct Multisearch for Multiobjective Optimization
- Steepest descent methods for multicriteria optimization
- A steepest descent method for vector optimization
- A projected gradient method for vector optimization problems
- Inexact projected gradient method for vector optimization
- On the convergence of the projected gradient method for vector optimization
- Newton's method for multiobjective optimization
- A quadratically convergent Newton method for vector optimization
- Nonsmooth multiobjective programming with quasi-Newton methods
- Quasi-Newton methods for solving multiobjective optimization
- Convergence of the projected gradient method for quasiconvex multiobjective optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Hybrid approximate proximal method with auxiliary variational inequality for vector optimization
- Hybrid approximate proximal algorithms for efficient solutions in vector optimization
- Generalized proximal method for efficient solutions in vector optimization
- Quasi-Newton's method for multiobjective optimization
- Quasi-Newton methods for multiobjective optimization problems
- A superlinearly convergent nonmonotone quasi-Newton method for unconstrained multiobjective optimization
- A modified Quasi-Newton method for vector optimization problem
- Newton-like methods for efficient solutions in vector optimization
- A Wolfe Line Search Algorithm for Vector Optimization
- Conditional gradient method for vector optimization
- A perfect example for the BFGS method
- Conditional gradient method for multiobjective optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- Extended Newton methods for multiobjective optimization: majorizing function technique and convergence analysis
- Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems
- Globally convergent Newton-type methods for multiobjective optimization
- The multiobjective steepest descent direction is not Lipschitz continuous, but is Hölder continuous
- A study of Liu-Storey conjugate gradient methods for vector optimization
- A quasi-Newton method with Wolfe line searches for multiobjective optimization
- A limited memory quasi-Newton approach for multi-objective optimization