Discrete gradient method: Derivative-free method for nonsmooth optimization (Q946181)

From MaRDI portal
Property / describes a project that uses: GAMS
Property / describes a project that uses: GradSamp
Property / MaRDI profile type: MaRDI publication profile
Property / OpenAlex ID: W2048153686
Property / cites work: Generalized Bundle Methods
Property / cites work: A bundle type approach to the unconstrained minimization of convex nonsmooth functions
Property / cites work: Methods of descent for nondifferentiable optimization
Property / cites work: Q4132264
Property / cites work: An Algorithm for Constrained Optimization with Semismooth Functions
Property / cites work: Q4146571
Property / cites work: Algorithms for finite and semi-infinite Min-Max-Min problems using adaptive smoothing techniques
Property / cites work: A Robust Gradient Sampling Algorithm for Nonsmooth, Nonconvex Optimization
Property / cites work: Analysis of Generalized Pattern Searches
Property / cites work: On the Convergence of Pattern Search Algorithms
Property / cites work: Q3704489
Property / cites work: Semismooth and Semiconvex Functions in Constrained Optimization
Property / cites work: Q4875353
Property / cites work: Unsupervised and supervised data classification via nonsmooth and global optimization (with comments and rejoinder)
Property / cites work: A new nonsmooth optimization algorithm for minimum sum-of-squares clustering problems
Property / cites work: Q4946119
Property / cites work: Continuous subdifferential approximations and their applications
Property / cites work: Finding the nearest point in A polytope
Property / cites work: Solving semidefinite quadratic problems within nonsmooth optimization algorithms
Property / cites work: A Dual Method for Certain Positive Semidefinite Quadratic Programming Problems

Latest revision as of 17:29, 28 June 2024

scientific article
Language: English
Label: Discrete gradient method: Derivative-free method for nonsmooth optimization
Description: scientific article

    Statements

    Discrete gradient method: Derivative-free method for nonsmooth optimization (English)
    22 September 2008
    The paper of A. M. Bagirov, B. Karasözen and M. Sezer is a valuable contribution to the numerical-algorithmic part of unconstrained continuous optimization, specifically to its nonsmooth version, and the corresponding theory is provided as well. It stands in the tradition of nonsmooth analysis and optimization, but it also benefits from insight and experience related to important classes of modern applications, hosted in the field of data mining and, especially, in clustering theory. There, the classical rules of nonsmooth calculus are not all fully applicable, in particular due to the lack of the so-called regularity condition, so that some classical or regularity-based nonsmooth results and algorithms can fail. For this reason, A. M. Bagirov and his colleagues proposed and developed the concept and use of discrete gradients as an alternative in nonsmooth optimization. Here, the corresponding subgradients are approximated by discrete gradients; their use makes the authors' approach one of derivative-free optimization.
    The authors demonstrate that this approximation can be made for a broad class of functions and that their discrete gradient method (DGM) can be applied to find descent directions. They provide numerical experiments and compare their results with the nonsmooth optimization solver DNLP (from CONOPT-GAMS) and the derivative-free optimization solver CONDOR. The paper is well organized: it first introduces classes of functions such as locally Lipschitzian, semismooth and quasidifferentiable functions, then discusses the clustering problem and nonregularity, presents the approximation of subgradients and the computation of subdifferentials via discrete gradients, shows how descent directions are computed, presents the DGM and finally reports the numerical experience for (mostly) nonconvex problems, before concluding that DGM indeed delivers very good results within that comparison.
    Taking into account how important these data mining applications are, that nonsmooth optimization is becoming more and more important in other fields such as finance and engineering, and that the constrained case could be treated, too, this conclusion is promising indeed.
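    The idea the review describes, replacing subgradients by finite-difference "discrete gradients" and using them to drive a descent loop, can be illustrated with a minimal sketch. The function names, the choice of search direction and the step-size decay below are illustrative assumptions for a toy example, not the authors' actual DGM construction:

```python
import math

def discrete_gradient(f, x, d, h=1e-6):
    """Coordinate finite differences of f taken at a point shifted
    along direction d: a crude stand-in for a subgradient (this is a
    simplified illustration, not the paper's discrete gradient)."""
    base = f([xi + h * di for xi, di in zip(x, d)])
    g = []
    for i in range(len(x)):
        xp = [xi + h * di for xi, di in zip(x, d)]
        xp[i] += h                      # perturb coordinate i
        g.append((f(xp) - base) / h)
    return g

def dgm_sketch(f, x0, steps=200, h=1e-6, lr=0.1):
    """Toy descent loop: use discrete gradients as search directions,
    with a normalized step and a simple geometric step-size decay
    (both are illustrative choices, not the method's line search)."""
    x = list(map(float, x0))
    for _ in range(steps):
        nx = math.sqrt(sum(xi * xi for xi in x)) + 1e-12
        d = [xi / nx for xi in x]       # an arbitrary direction choice
        g = discrete_gradient(f, x, d, h)
        ng = math.sqrt(sum(gi * gi for gi in g))
        if ng < 1e-8:                   # approximate stationarity
            break
        x = [xi - lr * gi / ng for xi, gi in zip(x, g)]
        lr *= 0.99
    return x

# Nonsmooth convex test function f(x) = |x1| + |x2|, minimum at the origin;
# no gradient of f is ever requested, only function values.
x_star = dgm_sketch(lambda v: abs(v[0]) + abs(v[1]), [1.0, -2.0])
```

    The point of the sketch is only that descent on a nondifferentiable function can proceed from function evaluations alone, which is what makes the approach derivative-free; the paper's actual construction of discrete gradients and descent directions is considerably more refined.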
    nonsmooth optimization
    derivative-free optimization
    subdifferentials
    discrete gradients