Iteration and evaluation complexity for the minimization of functions whose computation is intrinsically inexact
Publication: 5235099
DOI: 10.1090/mcom/3445
zbMATH: 1461.65130
OpenAlex: W2922889119
Wikidata: Q122112889 (Scholia: Q122112889)
MaRDI QID: Q5235099
Ernesto G. Birgin, Nataša Krejić, José Mario Martínez
Publication date: 7 October 2019
Published in: Mathematics of Computation
Full work available at URL: https://doi.org/10.1090/mcom/3445
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Applications of mathematical programming (90C90); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10)
Related Items
- Inexact restoration for derivative-free expensive function minimization and applications
- Stochastic analysis of an adaptive cubic regularization method under inexact gradient evaluations and dynamic Hessian accuracy
- Inexact restoration for minimization with inexact evaluation both of the objective function and the constraints
- Inexact restoration with subsampled trust-region methods for finite-sum minimization
- Non-monotone inexact restoration method for nonlinear programming
- The impact of noise on evaluation complexity: the deterministic trust-region case
- A stochastic first-order trust-region method with inexact restoration for finite-sum minimization
- An inexact restoration-nonsmooth algorithm with variable accuracy for stochastic nonsmooth convex optimization problems in machine learning and stochastic linear complementarity problems
Cites Work
- On the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimization
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Optimality functions in stochastic programming
- An inexact-restoration method for nonlinear bilevel programming problems
- Local convergence of an inexact-restoration method and numerical experiments
- Efficient sample sizes in stochastic nonlinear programming
- A new line search inexact restoration approach for nonlinear programming
- Foundations of bilevel programming
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- Inexact restoration for Euler discretization of box-constrained optimal control problems
- Inexact-restoration algorithm for constrained optimization
- Assessing the reliability of general-purpose inexact restoration methods
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Euler discretization and inexact restoration for optimal control
- Cubic regularization of Newton method and its global performance
- Inexact Restoration approach for minimization with inexact evaluation of the objective function
- Evaluation Complexity for Nonlinear Constrained Optimization Using Unscaled KKT Conditions and High-Order Models
- Optimal Budget Allocation for Sample Average Approximation
- Inexact Restoration Method for Derivative-Free Optimization with Smooth Constraints
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- On Choosing Parameters in Retrospective-Approximation Algorithms for Stochastic Root Finding and Simulation Optimization
- Local Minimizers of Quadratic Functions on Euclidean Balls and Spheres
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- On High-order Model Regularization for Constrained Optimization
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- ARCq: a new adaptive regularization by cubics
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- A Globally Convergent Filter Method for Nonlinear Programming
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- Mathematical Programs with Equilibrium Constraints
- Inexact-restoration method with Lagrangian tangent decrease and new merit function for nonlinear programming.