Nonmonotone line search methods with variable sample size (Q2340358)
Property / author: Nataša Krejić
Property / author: Nataša Krklec Jerinkić
Language | Label | Description | Also known as
---|---|---|---
English | Nonmonotone line search methods with variable sample size | scientific article |
Statements
Nonmonotone line search methods with variable sample size (English)
16 April 2015
The paper deals with nonmonotone line search methods for unconstrained optimization. The objective function has the form of a mathematical expectation and is approximated by the sample average approximation (SAA) with a large sample of fixed size. Since function evaluation is expensive, methods that start with a small sample and increase the sample size throughout the optimization process are usually considered; the aim is to ensure increasing precision during the optimization procedure regardless of the behavior of the objective function. In this paper, the authors introduce and analyze a class of algorithms that combines nonmonotone line search rules with a variable sample size strategy, extending the results of \textit{N. Krejić} and \textit{N. Krklec} [J. Comput. Appl. Math. 245, 213--231 (2013; Zbl 1262.65066)]. The sample size may oscillate from iteration to iteration in accordance with the progress made in decreasing the objective function and with the precision measured by an approximate width of a confidence interval. The proposed methods yield approximate solutions of the SAA problem at a significantly smaller computational cost than the classical SAA method. A complete algorithm is presented and global convergence results for general search directions are proven. An R-linear rate of convergence is obtained when the gradient of the objective function is available and a descent search direction is used at every iteration. Extensive numerical experiments are carried out on academic optimization problems in a noisy environment as well as on problems with real data.
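To make the variable-sample-size idea concrete, the following is a minimal Python sketch, not the authors' algorithm: it pairs a nonmonotone Armijo-type acceptance test (the trial point is compared against the maximum of the last few function values) with a heuristic that enlarges the SAA sample when the achieved decrease is small relative to an approximate confidence-interval half-width, and shrinks it otherwise. All names and parameter choices (\texttt{saa\_value}, \texttt{ci\_width}, the toy objective, the doubling/halving rule) are illustrative assumptions.

```python
# Illustrative sketch of a nonmonotone line search with variable sample size;
# parameter choices and update rules are assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
xi = rng.normal(size=10_000)          # full sample defining the SAA problem

def saa_value(x, n):
    # SAA approximation of E[F(x, xi)] over the first n sample points;
    # F(x, xi) = (x - xi)^2 is a toy stochastic objective.
    return np.mean((x - xi[:n]) ** 2)

def saa_grad(x, n):
    return np.mean(2.0 * (x - xi[:n]))

def ci_width(x, n):
    # Approximate half-width of a 95% confidence interval for the SAA value,
    # used as a proxy for the precision of the current approximation.
    s = np.std((x - xi[:n]) ** 2, ddof=1)
    return 1.96 * s / np.sqrt(n)

x, n, n_max = 5.0, 100, len(xi)
M = 5                                  # memory of the nonmonotone rule
history = [saa_value(x, n)]

for k in range(100):
    g = saa_grad(x, n)
    d = -g                             # descent direction (steepest descent)
    f_ref = max(history[-M:])          # nonmonotone reference value

    alpha = 1.0                        # backtracking Armijo-type search
    while saa_value(x + alpha * d, n) > f_ref + 1e-4 * alpha * g * d:
        alpha *= 0.5
        if alpha < 1e-10:
            break

    x_new = x + alpha * d
    decrease = history[-1] - saa_value(x_new, n)

    # Heuristic sample-size update: grow n when the decrease is small
    # relative to the sampling error, shrink it when progress is easy.
    if decrease < ci_width(x_new, n):
        n = min(2 * n, n_max)
    else:
        n = max(n // 2, 100)

    x = x_new
    history.append(saa_value(x, n))   # values in history may use different n

print(f"x = {x:.4f}, final sample size n = {n}")
```

The nonmonotone reference value max(history[-M:]) permits occasional increases of the SAA value, which is natural here since changing the sample size changes the objective approximation between iterations.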
unconstrained minimization
nonmonotone line search
sample average approximation
variable sample size
algorithms
global convergence
local convergence
numerical experiments