Incremental gradient-free method for nonsmooth distributed optimization (Q2411165)

Property / author: Zhi-You Wu
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.3934/jimo.2017021
Property / OpenAlex ID: W2560284529
Property / cites work: A derivative-free method for linearly constrained nonsmooth optimization
Property / cites work: Stochastic optimization problems with nondifferentiable cost functionals
Property / cites work: Q4001523
Property / cites work: Q4830373
Property / cites work: Incremental proximal methods for large scale convex optimization
Property / cites work: Introduction to Derivative-Free Optimization
Property / cites work: Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling
Property / cites work: Randomized Smoothing for Stochastic Optimization
Property / cites work: A smoothing scheme for optimization problems with max-min constraints
Property / cites work: A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
Property / cites work: Convergence of Approximate and Incremental Subgradient Methods for Convex Optimization
Property / cites work: Gradient-free method for nonsmooth distributed optimization
Property / cites work: Distributed proximal-gradient method for convex optimization with inequality constraints
Property / cites work: Q2752037
Property / cites work: Incremental Subgradient Methods for Nondifferentiable Optimization
Property / cites work: Distributed Subgradient Methods for Multi-Agent Optimization
Property / cites work: Random gradient-free minimization of convex functions
Property / cites work: Robust identification
Property / cites work: Incremental Stochastic Subgradient Algorithms for Convex Optimization
Property / cites work: Normalized Incremental Subgradient Algorithm and Its Application
Property / cites work: Maximum flow problem in the distribution network
Property / cites work: Incremental gradient algorithms with stepsizes bounded away from zero
Property / cites work: Gradient-free method for distributed multi-agent optimization via push-sum algorithms
Property / cites work: A hybrid method combining genetic algorithm and Hooke-Jeeves method for constrained global optimization
Property / cites work: A derivative-free method for solving large-scale nonlinear systems of equations
Property / cites work: A new exact penalty function method for continuous inequality constrained optimization problems
Property / cites work: On stochastic gradient and subgradient methods with adaptive steplength sequences
Property / cites work: A fast dual proximal-gradient method for separable convex optimization with linear coupled constraints

Language: English
Label: Incremental gradient-free method for nonsmooth distributed optimization
Description: scientific article

    Statements

    Incremental gradient-free method for nonsmooth distributed optimization (English)
    Publication date: 20 October 2017
    Keywords: incremental method; Gaussian smoothing; gradient-free method; convex optimization
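
The keywords above (incremental method, Gaussian smoothing, gradient-free method, convex optimization) suggest the general recipe behind the paper: an incremental pass over the component functions in which each exact gradient is replaced by a two-point Gaussian-smoothing estimate, in the spirit of "Random gradient-free minimization of convex functions" cited above. The Python sketch below is purely illustrative and not the algorithm analysed in the paper; the function names, step size and smoothing parameter are assumptions.

import numpy as np

def gaussian_smoothing_grad(f, x, mu, rng):
    # Two-point estimate of the gradient of the Gaussian-smoothed surrogate of f.
    u = rng.standard_normal(x.shape)          # random Gaussian direction
    return (f(x + mu * u) - f(x)) / mu * u

def incremental_gradient_free_cycle(fs, x, alpha, mu, rng):
    # One incremental cycle: visit each component f_i in turn and step against
    # its gradient-free (smoothed-gradient) estimate.
    for f_i in fs:
        x = x - alpha * gaussian_smoothing_grad(f_i, x, mu, rng)
    return x

# Toy usage (assumed setup): minimise f(x) = sum_i ||x - a_i||_1, a nonsmooth
# convex sum whose minimiser is the componentwise median of the anchors a_i.
rng = np.random.default_rng(0)
anchors = [0.0, 1.0, 2.0]
fs = [lambda x, a=a: np.abs(x - a).sum() for a in anchors]
x = np.zeros(3)
for k in range(1, 201):
    x = incremental_gradient_free_cycle(fs, x, alpha=0.2 / np.sqrt(k), mu=1e-3, rng=rng)
print(np.round(x, 2))   # roughly 1.0 in each coordinate (the median anchor)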