Steepest descent methods for critical points in vector optimization problems
From MaRDI portal
Publication: 3143359
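For context, the publication concerns steepest descent iterations for finding (Pareto) critical points of vector-valued objectives. A minimal sketch of the classical two-objective case, in the spirit of the cited Fliege–Svaiter steepest descent method, takes as descent direction the negative of the minimum-norm convex combination of the gradients and backtracks until an Armijo condition holds for every objective. All function names and parameters below are illustrative, not taken from the paper:

```python
import numpy as np

def steepest_descent_direction(g1, g2):
    """Negative minimum-norm convex combination of two gradients.

    For two objectives the direction-finding subproblem reduces to
    minimizing ||lam*g1 + (1-lam)*g2||^2 over lam in [0, 1], which
    has a closed-form solution (clip the unconstrained minimizer).
    """
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.0 if denom == 0 else np.clip(-(g2 @ diff) / denom, 0.0, 1.0)
    return -(lam * g1 + (1.0 - lam) * g2)

def multiobjective_steepest_descent(fs, grads, x0, beta=1e-4,
                                    iters=100, tol=1e-8):
    """Steepest descent with Armijo backtracking on every objective.

    Stops when the steepest descent direction is (near) zero, i.e.
    when the iterate is approximately Pareto critical.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = [gr(x) for gr in grads]
        d = steepest_descent_direction(g[0], g[1])
        if np.linalg.norm(d) < tol:  # Pareto critical point reached
            break
        t = 1.0
        # Backtrack until sufficient decrease holds for ALL objectives.
        while any(f(x + t * d) > f(x) + beta * t * (gi @ d)
                  for f, gi in zip(fs, g)):
            t *= 0.5
        x = x + t * d
    return x

# Example: two convex quadratics whose Pareto set is [0, 1] x {0}.
f1 = lambda x: x[0] ** 2 + x[1] ** 2
f2 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2
g1 = lambda x: np.array([2 * x[0], 2 * x[1]])
g2 = lambda x: np.array([2 * (x[0] - 1.0), 2 * x[1]])
x_star = multiobjective_steepest_descent([f1, f2], [g1, g2], [2.0, 2.0])
```

The returned point lies on the Pareto set of the two quadratics; no single scalarization weight has to be fixed a priori, which is the practical appeal of this family of methods.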
Recommendations
- A steepest descent method for vector optimization
- Steepest descent methods for multicriteria optimization
- A steepest descent-like method for variable order vector optimization problems
- An interior proximal method in vector optimization
- A steepest descent-like method for vector optimization problems with variable domination structure
Cites work
- A projected gradient method for vector optimization problems
- A steepest descent method for vector optimization
- Approximate proximal methods in vector optimization
- Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization
- Critical points index for vector functions and vector optimization
- Degrees of Efficiency and Degrees of Minimality
- Exchange processes with price adjustment
- Full convergence of the steepest descent method with inexact line searches
- Generalized deviations in risk analysis
- Global analysis and economics. II: Extension of a theorem of Debreu
- Global analysis and economics. III: Pareto Optima and price equilibria
- Global analysis and economics. IV: Finiteness and stability of equilibria with general consumption sets and production
- Hybrid approximate proximal method with auxiliary variational inequality for vector optimization
- Invex functions and generalized convexity in multiobjective programming
- Newton's method for multiobjective optimization
- On the choice of parameters for the weighting method in vector optimization
- Proximal Methods in Vector Optimization
- Slow solutions of a differential inclusion and vector optimization
- Steepest descent methods for multicriteria optimization
- Tangent Cones, Generalized Gradients and Mathematical Programming in Banach Spaces
Cited in (22)
- Critical points index for vector functions and vector optimization
- On \(q\)-steepest descent method for unconstrained multiobjective optimization problems
- Conditional gradient method for vector optimization
- A steepest descent method for set optimization problems with set-valued mappings of finite cardinality
- A steepest descent-like method for variable order vector optimization problems
- Tikhonov-type regularization method for efficient solutions in vector optimization
- A study of Liu-Storey conjugate gradient methods for vector optimization
- On the extension of the Hager-Zhang conjugate gradient method for vector optimization
- A steepest descent-like method for vector optimization problems with variable domination structure
- Newton-like methods for efficient solutions in vector optimization
- Nonsmooth steepest descent method by proximal subdifferentials in Hilbert spaces
- Relaxed steepest descent and Cauchy-Barzilai-Borwein method
- Steepest descent methods for multicriteria optimization
- Newton-like methods for solving vector optimization problems
- Overlooked Branch Cut in Steepest Descent Method: Switching Line and Atomic Domain
- A modified Quasi-Newton method for vector optimization problem
- A steepest descent method for vector optimization
- A PRP type conjugate gradient method without truncation for nonconvex vector optimization
- Steepest descent methods with generalized distances for constrained optimization
- Nonlinear Conjugate Gradient Methods for Vector Optimization