A new steepest descent differential inclusion-based method for solving general nonsmooth convex optimization problems (Q2442700)

From MaRDI portal
Property / author: Mohammed Hosseini Ali Abadi
Property / reviewed by: Maxim Ivanov Todorov
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s10957-012-0258-4
Property / OpenAlex ID: W1991261176
Property / cites work: Nonsmooth Optimization Using Mordukhovich's Subdifferential
Property / cites work: Q5187067
Property / cites work: The Cutting-Plane Method for Solving Convex Programs
Property / cites work: Decomposition and Nondifferentiable Optimization with the Projective Algorithm
Property / cites work: On approximations with finite precision in bundle methods for nonsmooth optimization
Property / cites work: New bundle methods for solving Lagrangian relaxation dual problems
Property / cites work: Generalized Neural Network for Nonsmooth Nonlinear Programming Problems
Property / cites work: Subgradient-based feedback neural networks for non-differentiable convex optimization problems
Property / cites work: Generalized gradient projection neural networks for nonsmooth optimization problems
Property / cites work: New variants of bundle methods
Property / cites work: Survey of Bundle Methods for Nonsmooth Optimization
Property / cites work: Methods of descent for nondifferentiable optimization
Property / cites work: Q4004158
Property / cites work: Q3324260
Property / cites work: Semismooth and Semiconvex Functions in Constrained Optimization
Property / cites work: A feasible semismooth asymptotically Newton method for mixed complementarity problems
Property / cites work: Optimization and nonsmooth analysis
Property / cites work: Q3994508
Property / cites work: A differential inclusion-based approach for solving nonsmooth convex optimization problems
Property / cites work: Essentially smooth Lipschitz functions

Latest revision as of 12:33, 7 July 2024

Language: English
Label: A new steepest descent differential inclusion-based method for solving general nonsmooth convex optimization problems
Description: scientific article

    Statements

    A new steepest descent differential inclusion-based method for solving general nonsmooth convex optimization problems (English)
    1 April 2014
    The authors consider general nonsmooth convex optimization problems. Differential equation-based and differential inclusion-based methods for solving optimization problems fall into four classes: gradient projection methods, primal-dual methods, Lagrange multiplier rule methods, and penalty-based methods. The method investigated in the present work is a steepest descent neural network for solving general nonsmooth convex optimization problems. The authors prove analytically that the trajectories converge to the optimal solution set and apply the method to several numerical tests, confirming the effectiveness and performance of the proposed neural network.
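    The steepest descent differential inclusion underlying such methods is the subgradient flow x'(t) ∈ -∂f(x(t)). A minimal sketch of this idea, using an explicit Euler discretization on a toy nondifferentiable objective (illustrative only, not the authors' network; the objective and step size are assumptions for the example):

```python
import numpy as np

def subgradient(x):
    # A subgradient of the toy objective f(x) = |x_1| + |x_2|.
    # np.sign returns 0 at 0, which is a valid element of the
    # subdifferential of |.| there.
    return np.sign(x)

def subgradient_flow(x0, step=0.01, n_steps=2000):
    # Explicit Euler discretization of the steepest descent
    # differential inclusion x'(t) in -subdifferential of f at x(t).
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step * subgradient(x)
    return x

x_star = subgradient_flow([1.5, -0.8])
# With a fixed step the iterates reach a neighborhood of the
# minimizer (0, 0) of size proportional to the step.
```

    The continuous-time method in the paper selects, at each state, the minimum-norm element of the subdifferential; the fixed-step discretization above only approximates that behavior, which is why the iterates stall within a step-sized band around the solution set.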
    steepest descent neural network
    differential inclusion-based methods
    general nonsmooth convex optimization
    convergence of trajectories

    Identifiers