First-order methods of smooth convex optimization with inexact oracle (Q403634)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | First-order methods of smooth convex optimization with inexact oracle | scientific article | |
Statements
First-order methods of smooth convex optimization with inexact oracle (English)
29 August 2014
A new notion of an inexact first-order oracle for a convex function on a closed convex set in a finite-dimensional space is introduced. Various properties of the oracle are derived and several examples are given. In particular, it is shown that the oracle is applicable to smooth convex optimization problems whose objective function is itself defined by another optimization problem, as is the case, for example, with smoothing techniques, Moreau-Yosida regularization, and augmented Lagrangians. Next, efficiency estimates are derived for the classical primal and dual gradient methods (CGMs) and for fast gradient methods (FGMs) when the objective function of the smooth convex optimization problem is equipped with the inexact oracle. In particular, the relation between the desired accuracy in the objective function and the required accuracy of the oracle is investigated. A key finding of the study is that, when supplied with an inexact first-order oracle, FGMs are no longer necessarily superior to CGMs, since they are shown to suffer more from error accumulation; see the sketch below. The new oracle is also compared with other recently proposed inexact oracles. Finally, it is shown that non-smooth and weakly smooth convex problems can be solved by first-order methods for smooth convex optimization when these methods are furnished with the new inexact oracle.
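For orientation, the following is a minimal sketch of the $(\delta,L)$-oracle notion that the paper introduces, stated in the authors' notation: $Q$ is the feasible set, and $(f_{\delta,L}(x), g_{\delta,L}(x))$ is the pair returned by the oracle at a query point $x \in Q$.

```latex
% (\delta,L)-oracle: the pair (f_{\delta,L}(x), g_{\delta,L}(x)) must satisfy,
% for every y in the feasible set Q,
0 \;\le\; f(y) - f_{\delta,L}(x) - \langle g_{\delta,L}(x),\, y - x \rangle
  \;\le\; \frac{L}{2}\,\|y - x\|^2 + \delta .
```

The error-accumulation phenomenon mentioned in the review corresponds to efficiency estimates of roughly the following shape, given here in $O$-notation as a sketch rather than with the paper's exact constants; $x_k$ denotes the $k$-th iterate, $f^*$ the optimal value, and $R$ the distance from the starting point to a minimizer.

```latex
% CGMs: the oracle error \delta enters the bound once and does not accumulate,
f(x_k) - f^* \;\le\; O\!\left(\tfrac{L R^2}{k}\right) + \delta ,
% FGMs: the faster O(1/k^2) rate is paid for by an error term
% that grows linearly with the iteration count,
f(x_k) - f^* \;\le\; O\!\left(\tfrac{L R^2}{k^2}\right) + O(k\,\delta) .
```

This trade-off is what makes the classical methods competitive again under inexact first-order information.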
smooth convex optimization
large-scale optimization
first-order methods
inexact oracle
gradient methods
fast gradient methods
complexity bounds