Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems
From MaRDI portal
Publication:6568922
Recommendations
- A globally convergent BFGS method with nonmonotone line search for non-convex minimization
- A modified BFGS method and its global convergence in nonconvex minimization
- Global convergence properties of the modified BFGS method associating with general line search model
- The global convergence of a modified BFGS method under inexact line search for nonconvex functions
- Global convergence of a modified limited memory BFGS method for non-convex minimization
Cites work
- scientific article; zbMATH DE number 3529352 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Family of Variable-Metric Methods Derived by Variational Means
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A Wolfe Line Search Algorithm for Vector Optimization
- A limited memory quasi-Newton approach for multi-objective optimization
- A modified BFGS method and its global convergence in nonconvex minimization
- A modified Quasi-Newton method for vector optimization problem
- A new approach to variable metric algorithms
- A perfect example for the BFGS method
- A projected gradient method for vector optimization problems
- A quadratically convergent Newton method for vector optimization
- A quasi-Newton method with Wolfe line searches for multiobjective optimization
- A steepest descent method for vector optimization
- A study of Liu-Storey conjugate gradient methods for vector optimization
- A superlinearly convergent nonmonotone quasi-Newton method for unconstrained multiobjective optimization
- Adaptive Scalarization Methods in Multiobjective Optimization
- Approximate proximal methods in vector optimization
- Benchmarking optimization software with performance profiles
- Conditional gradient method for multiobjective optimization
- Conditional gradient method for vector optimization
- Conditioning of Quasi-Newton Methods for Function Minimization
- Convergence Properties of the BFGS Algorithm
- Convergence analysis of a nonmonotone projected gradient method for multiobjective optimization problems
- Convergence of the projected gradient method for quasiconvex multiobjective optimization
- Direct Multisearch for Multiobjective Optimization
- Extended Newton methods for multiobjective optimization: majorizing function technique and convergence analysis
- Generalized proximal method for efficient solutions in vector optimization
- Globally convergent Newton-type methods for multiobjective optimization
- Hybrid approximate proximal algorithms for efficient solutions in vector optimization
- Hybrid approximate proximal method with auxiliary variational inequality for vector optimization
- Inexact projected gradient method for vector optimization
- Newton's method for multiobjective optimization
- Newton-like methods for efficient solutions in vector optimization
- Nonlinear Conjugate Gradient Methods for Vector Optimization
- Nonlinear multiobjective optimization
- Nonsmooth multiobjective programming with quasi-Newton methods
- On the convergence of the projected gradient method for vector optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- Practical augmented Lagrangian methods for constrained optimization
- Proximal Methods in Vector Optimization
- Quasi-Newton methods for multiobjective optimization problems
- Quasi-Newton methods for solving multiobjective optimization
- Quasi-Newton's method for multiobjective optimization
- Steepest descent methods for multicriteria optimization
- The BFGS method with exact line searches fails for non-convex objective functions
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- The multiobjective steepest descent direction is not Lipschitz continuous, but is Hölder continuous