Discrete gradient method: Derivative-free method for nonsmooth optimization (Q946181)

    Statements

    Discrete gradient method: Derivative-free method for nonsmooth optimization (English)
    22 September 2008
    The paper of A. M. Bagirov, B. Karasözen and M. Sezer is a valuable contribution to the numerical and algorithmic side of unconstrained continuous optimization, in its nonsmooth version; the corresponding theory is provided as well. It stands in the tradition of nonsmooth analysis and optimization, but it benefits from insight and experience gained in important classes of modern applications, hosted in the field of data mining and, especially, in clustering theory. There, the classical rules of nonsmooth calculus are not all fully applicable, in particular because the so-called regularity condition may fail, so that classical or regularity-based nonsmooth results and algorithms can break down. For this reason, A. M. Bagirov and his colleagues proposed and developed the concept and use of discrete gradients as an alternative in nonsmooth optimization: the subgradients are approximated by discrete gradients, which makes the authors' approach a derivative-free one.

    The authors demonstrate that this approximation can be carried out for a broad class of functions and that their discrete gradient method (DGM) can be applied to find descent directions. They provide numerical experiments and compare their results with the nonsmooth optimization solver DNLP (from CONOPT-GAMS) and the derivative-free optimization solver CONDOR. The paper is well organized: it first introduces the relevant classes of functions (locally Lipschitzian, semismooth and quasidifferentiable), discusses the clustering problem and nonregularity, presents the approximation of subgradients and the computation of subdifferentials via discrete gradients, shows how to compute descent directions, presents DGM and then reports the numerical experience on (mostly) nonconvex problems, before concluding that DGM delivers very good results within that comparison.

    Taking into account how important these data mining applications are, that nonsmooth optimization is becoming more and more important in other fields such as finance and engineering, and that the constrained case could be treated, too, this conclusion is promising indeed.
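    To make the reviewed construction concrete, here is a minimal, illustrative Python sketch, not the authors' exact algorithm: a discrete gradient is assembled from sequential coordinate-wise finite differences along a ray x + lam*g, with one coordinate solved from the identity f(x + lam*g) - f(x) = lam*<v, g>, and a simplified descent direction is read off an approximate least-norm element of the convex hull of several such discrete gradients. All names, step sizes, the random choice of directions and the Frank-Wolfe solver for the least-norm problem are assumptions of this sketch; the paper solves the corresponding quadratic subproblem exactly and embeds everything in a full line-search method.

import numpy as np

def discrete_gradient(f, x, g, lam=1e-6, beta=1e-8):
    """Finite-difference 'discrete gradient' of f at x along a unit
    direction g. Coordinates j != i are sequential coordinate-wise
    differences along the ray x + lam*g; the i-th coordinate
    (i = argmax |g_i|) is solved from the exact identity
    f(x + lam*g) - f(x) = lam * <v, g>. Step sizes are illustrative."""
    n = x.size
    i = int(np.argmax(np.abs(g)))
    v = np.zeros(n)
    x_prev = x + lam * g
    f0 = f(x_prev)                    # f at the tip of the ray
    f_prev = f0
    for j in range(n):
        if j == i:
            continue
        x_next = x_prev.copy()
        x_next[j] += beta             # small coordinate step
        f_next = f(x_next)
        v[j] = (f_next - f_prev) / beta
        x_prev, f_prev = x_next, f_next
    # v[i] is still 0 here, so np.dot(v, g) sums only over j != i
    v[i] = (f0 - f(x) - lam * np.dot(v, g)) / (lam * g[i])
    return v

def min_norm_element(V, iters=500):
    """Approximate least-norm point of the convex hull of the rows of V
    by Frank-Wolfe on the unit simplex; a simple stand-in for the
    quadratic subproblem solved exactly in the paper."""
    m = V.shape[0]
    w = np.full(m, 1.0 / m)
    for k in range(iters):
        grad = V @ (V.T @ w)          # gradient of 0.5 * ||V^T w||^2
        s = int(np.argmin(grad))      # best vertex of the simplex
        gamma = 2.0 / (k + 2.0)
        w *= 1.0 - gamma
        w[s] += gamma
    return V.T @ w

# Toy usage: f(x) = |x_1| + |x_2| is nonsmooth; at x = (1, -0.5) the
# least-norm combination of discrete gradients approximates the
# subgradient (1, -1), and -v/||v|| is a descent direction.
f = lambda x: np.abs(x).sum()
x = np.array([1.0, -0.5])
rng = np.random.default_rng(0)
dirs = []
for _ in range(8):
    g = rng.standard_normal(x.size)
    dirs.append(discrete_gradient(f, x, g / np.linalg.norm(g)))
v = min_norm_element(np.array(dirs))
d = -v / np.linalg.norm(v)            # descent direction if ||v|| > tolerance

    Roughly, in the method described in the paper such discrete gradients play the role of subgradients: they are accumulated until either their least-norm combination is small (an approximate stationarity test) or a direction of sufficient descent is found, after which a line search is performed.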
    nonsmooth optimization
    derivative-free optimization
    subdifferentials
    discrete gradients