An overview of nonlinear optimization
DOI: 10.4171/ICM2022/72 | OpenAlex: W4389775226 | MaRDI QID: Q6200213 | FDO: Q6200213
Publication date: 22 March 2024
Published in: International Congress of Mathematicians
Full work available at URL: https://doi.org/10.4171/icm2022/72
Keywords: constrained optimization; unconstrained optimization; conjugate gradient method; gradient method; interior-point method; sequential quadratic programming method; augmented Lagrangian method of multipliers; optimization with least constraint violation
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
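Note: several keywords and cited works below concern gradient methods with two-point (Barzilai-Borwein) step sizes. The following minimal Python sketch is an illustration assumed for this page, not code from the publication; the function name and defaults are hypothetical. It shows the BB1 step-size rule alpha = s's / s'y built from successive iterate and gradient differences.

```python
import numpy as np

def bb_gradient_method(grad, x0, max_iter=100, tol=1e-8):
    """Two-point step size (Barzilai-Borwein) gradient method.

    Illustrative sketch only: `grad` is assumed to return the
    gradient of a smooth objective at x.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0  # initial step size (an assumption)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s = x_new - x   # iterate difference s_k
        y = g_new - g   # gradient difference y_k
        sty = s @ y
        # BB1 step size alpha = s's / s'y; fall back if s'y <= 0
        alpha = (s @ s) / sty if sty > 0 else 1.0
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
x_star = bb_gradient_method(lambda x: A @ x - b, x0=np.zeros(2))
```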
Cites Work
- 15 further cited works (titles not available)
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Function minimization by conjugate gradients
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- Two-Point Step Size Gradient Methods
- On the Convergence of a New Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Multiplier and gradient methods
- Efficient hybrid conjugate gradient techniques
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Minimization of functions having Lipschitz continuous first partial derivatives
- Nonlinear programming without a penalty function
- Nonlinear programming without a penalty function or a filter
- Analysis of monotone gradient methods
- A superlinearly convergent algorithm for constrained optimization problems
- On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method
- On the superlinear local convergence of a filter-SQP method
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- An Interior Point Algorithm for Large-Scale Nonlinear Programming
- A Subspace Study on Conjugate Gradient Algorithms
- An efficient hybrid conjugate gradient method for unconstrained optimization
- A dual approach to solving nonlinear programming problems by unconstrained optimization
- Gradient methods with adaptive step-sizes
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- R-linear convergence of the Barzilai and Borwein gradient method
- Infeasibility Detection and SQP Methods for Nonlinear Optimization
- On the Barzilai and Borwein choice of steplength for the gradient method
- The Limited Memory Conjugate Gradient Method
- On the asymptotic behaviour of some new gradient methods
- On the asymptotic directions of the s-dimensional optimum gradient method
- A Barzilai-Borwein conjugate gradient method
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- An interior-point algorithm for nonconvex nonlinear programming
- The watchdog technique for forcing convergence in algorithms for constrained optimization
- Failure of global convergence for a class of interior point methods for nonlinear programming
- A new technique for inconsistent QP problems in the SQP method
- A penalty-interior-point algorithm for nonlinear constrained optimization
- A Sequential Quadratic Optimization Algorithm with Rapid Infeasibility Detection
- A Robust Algorithm for Optimization with General Equality and Inequality Constraints
- A successive quadratic programming algorithm with global and superlinear convergence properties
- A null-space primal-dual interior-point algorithm for nonlinear optimization with nice convergence properties
- A primal-dual interior-point method capable of rapidly detecting infeasibility for nonlinear programs
- An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization
- A globally convergent primal-dual interior-point relaxation method for nonlinear programs
- Convergence of DFP algorithm
- A perfect example for the BFGS method
- Equipping the Barzilai-Borwein Method with the Two Dimensional Quadratic Termination Property
- Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
- A penalty-free method with superlinear convergence for equality constrained optimization
- On the Least Q-order of Convergence of Variable Metric Algorithms
- Implementation of a robust SQP algorithm