Gradient-free method for nonsmooth distributed optimization
Publication: 2018475
DOI: 10.1007/s10898-014-0174-2 · zbMath: 1341.90135 · OpenAlex: W2042220749 · MaRDI QID: Q2018475
Jueyou Li, Changzhi Wu, Qiang Long, Zhi-You Wu
Publication date: 24 March 2015
Published in: Journal of Global Optimization
Full work available at URL: http://researchonline.federation.edu.au/vital/access/HandleResolver/1959.17/61584
MSC classifications: Programming involving graphs or networks (90C35) · Convex programming (90C25) · Derivative-free methods and methods using generalized derivatives (90C56)
Related Items (17)
- A fast dual proximal-gradient method for separable convex optimization with linear coupled constraints
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- Gradient-free distributed optimization with exact convergence
- Stochastic mirror descent method for distributed multi-agent optimization
- Incremental gradient-free method for nonsmooth distributed optimization
- Regularized dual gradient distributed method for constrained convex optimization over unbalanced directed graphs
- Strong consistency of random gradient-free algorithms for distributed optimization
- A gradient-free distributed optimization method for convex sum of nonconvex cost functions
- Distributed optimization methods for nonconvex problems with inequality constraints over time-varying networks
- Distributed subgradient method for multi-agent optimization with quantized communication
- An objective penalty function-based method for inequality constrained minimization problem
- Recent theoretical advances in decentralized distributed convex optimization
- Distributed convex optimization with coupling constraints over time-varying directed graphs
- Sparsity-promoting distributed charging control for plug-in electric vehicles over distribution networks
- A fully distributed ADMM-based dispatch approach for virtual power plant problems
- Splitting proximal with penalization schemes for additive convex hierarchical minimization problems
- An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
Cites Work
- Primal-dual subgradient methods for convex problems
- Distributed stochastic subgradient projection algorithms for convex optimization
- Incremental proximal methods for large scale convex optimization
- Distributed average consensus with least-mean-square deviation
- Convergence rate for consensus with delays
- Robust identification
- Global optimization for molecular clusters using a new smoothing approach
- New horizons in sphere-packing theory, part II: Lattice-based derivative-free optimization via global surrogates
- Random gradient-free minimization of convex functions
- Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Randomized Smoothing for Stochastic Optimization
- A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
- Matrix Analysis
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Introduction to Stochastic Search and Optimization
- Distributed Subgradient Methods for Multi-Agent Optimization
- Constrained Consensus and Optimization in Multi-Agent Networks
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling