Global convergence rate analysis of unconstrained optimization methods based on probabilistic models (Q1646566)

From MaRDI portal
scientific article

    Statements

    Global convergence rate analysis of unconstrained optimization methods based on probabilistic models (English)
    25 June 2018
    This paper studies a class of methods that retain the best properties of deterministic algorithms while relaxing the assumption that the derivative/model error is bounded in a deterministic manner. In particular, the authors analyze global convergence rates for a line search method based on random models, covering general nonconvex, convex, and strongly convex functions. It is shown that, in terms of the order of accuracy, the evaluation complexity of such a method matches that of its counterparts that use deterministically accurate models; the use of probabilistic models increases the complexity only by a constant factor, which depends on the probability of the models being good. A second-order method is also studied -- an adaptive probabilistic cubic regularization method. It is shown that this method has improved complexity bounds compared to probabilistic first-order methods and that these bounds are of the same optimal order as in the deterministic case.
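    The first-order framework summarized above can be illustrated with a minimal sketch: a backtracking line search in which the gradient model is accurate only with some probability. All names, constants, and the Gaussian noise model below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def probabilistic_line_search(f, grad, x0, p_good=0.8, alpha_max=1.0,
                              theta=0.5, c=1e-4, max_iter=200, seed=0):
    """Illustrative sketch (not the paper's exact method): backtracking
    line search whose gradient model is accurate only with probability
    p_good; otherwise the gradient is corrupted by Gaussian noise."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    alpha = alpha_max
    for _ in range(max_iter):
        g = grad(x)
        if rng.random() > p_good:
            # "bad" model: gradient corrupted by noise of comparable size
            g = g + rng.normal(scale=np.linalg.norm(g) + 1e-12, size=g.shape)
        d = -g  # descent direction suggested by the (random) model
        # Armijo sufficient-decrease test: true f, random model gradient g
        if f(x + alpha * d) <= f(x) - c * alpha * np.dot(g, g):
            x = x + alpha * d
            alpha = min(alpha / theta, alpha_max)  # successful: expand step
        else:
            alpha *= theta                         # unsuccessful: shrink step
    return x

# Usage: minimize a strongly convex quadratic f(x) = ||x||^2
x_star = probabilistic_line_search(lambda x: float(np.dot(x, x)),
                                   lambda x: 2.0 * x,
                                   x0=[3.0, -4.0])
```

    The sketch only illustrates the accept/reject mechanics: accepted steps always decrease the true objective, so occasional bad models cannot undo progress, which is consistent with the reviewed result that probabilistic models inflate the complexity only by a constant depending on the probability of the models being good.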
    line search methods
    cubic regularization methods
    random models
    global convergence analysis
