An overview of nonlinear optimization
Publication: 6200213
DOI: 10.4171/icm2022/72
OpenAlex: W4389775226
MaRDI QID: Q6200213
Publication date: 22 March 2024
Published in: International Congress of Mathematicians
Full work available at URL: https://doi.org/10.4171/icm2022/72
Keywords: unconstrained optimization; sequential quadratic programming method; interior-point method; constrained optimization; conjugate gradient method; gradient method; augmented Lagrangian method of multipliers; optimization with least constraint violation
MSC classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
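The keywords above are topic tags rather than an exposition, but several of the cited works below (e.g. "Two-Point Step Size Gradient Methods" and "The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem") concern the Barzilai-Borwein step size for gradient methods. The following is a minimal illustrative sketch of that iteration, written here as an example and not taken from the surveyed paper; the function names and parameters are hypothetical.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=500, tol=1e-8):
    """Gradient descent with the BB1 (Barzilai-Borwein) step size; illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                           # small fixed step for the very first iteration
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:        # stop when the gradient is small
            break
        x_new = x - alpha * g              # gradient step with the current step size
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g        # differences of iterates and of gradients
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1e-3   # BB1 step; fall back if curvature is not positive
        x, g = x_new, g_new
    return x

# Usage example: minimize the convex quadratic 0.5*x'Ax - b'x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = bb_gradient(lambda x: A @ x - b, np.zeros(3))
```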
Cited works
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Barzilai-Borwein conjugate gradient method
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs
- Nonlinear programming without a penalty function or a filter
- Gradient methods with adaptive step-sizes
- Efficient hybrid conjugate gradient techniques
- An interior-point algorithm for nonconvex nonlinear programming
- A new technique for inconsistent QP problems in the SQP method
- Failure of global convergence for a class of interior point methods for nonlinear programming
- Analysis of monotone gradient methods
- On the superlinear local convergence of a filter-SQP method
- Convergence of DFP algorithm
- A perfect example for the BFGS method
- A null-space primal-dual interior-point algorithm for nonlinear optimization with nice convergence properties
- A penalty-free method with superlinear convergence for equality constrained optimization
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A penalty-interior-point algorithm for nonlinear constrained optimization
- On the asymptotic behaviour of some new gradient methods
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- On the asymptotic directions of the s-dimensional optimum gradient method
- Multiplier and gradient methods
- Minimization of functions having Lipschitz continuous first partial derivatives
- R-linear convergence of the Barzilai and Borwein gradient method
- Infeasibility Detection and SQP Methods for Nonlinear Optimization
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A Sequential Quadratic Optimization Algorithm with Rapid Infeasibility Detection
- On the Least Q-order of Convergence of Variable Metric Algorithms
- Algorithm 851
- Implementation of a robust SQP algorithm
- A successive quadratic programming algorithm with global and superlinear convergence properties
- Two-Point Step Size Gradient Methods
- A superlinearly convergent algorithm for constrained optimization problems
- The watchdog technique for forcing convergence in algorithms for constrained optimization
- Variable Metric Method for Minimization
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- On the Convergence of a New Conjugate Gradient Algorithm
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A Robust Algorithm for Optimization with General Equality and Inequality Constraints
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- An Interior Point Algorithm for Large-Scale Nonlinear Programming
- A Nonmonotone Line Search Technique for Newton’s Method
- A dual approach to solving nonlinear programming problems by unconstrained optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- A globally convergent primal-dual interior-point relaxation method for nonlinear programs
- On the Barzilai and Borwein choice of steplength for the gradient method
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Nonlinear programming without a penalty function
- An efficient hybrid conjugate gradient method for unconstrained optimization