Gradient-free method for nonsmooth distributed optimization
From MaRDI portal
Publication:2018475
Recommendations
- Distributed subgradient-free stochastic optimization algorithm for nonsmooth convex functions over time-varying networks
- Gradient-free push-sum method for strongly convex distributed optimization
- Strong consistency of random gradient-free algorithms for distributed optimization
- Gradient-free distributed optimization with exact convergence
- Distributed quasi-monotone subgradient algorithm for nonsmooth convex optimization over directed graphs
Cites Work
- scientific article; zbMATH DE number 5454133
- scientific article; zbMATH DE number 51132
- A Randomized Incremental Subgradient Method for Distributed Optimization in Networked Systems
- Constrained Consensus and Optimization in Multi-Agent Networks
- Convergence rate for consensus with delays
- Distributed Subgradient Methods for Multi-Agent Optimization
- Distributed asynchronous deterministic and stochastic gradient optimization algorithms
- Distributed average consensus with least-mean-square deviation
- Distributed stochastic subgradient projection algorithms for convex optimization
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling
- Dual averaging methods for regularized stochastic learning and online optimization
- Global optimization for molecular clusters using a new smoothing approach
- Incremental proximal methods for large scale convex optimization
- Incremental subgradient methods for nondifferentiable optimization
- Introduction to Stochastic Search and Optimization
- Matrix Analysis
- New horizons in sphere-packing theory, part II: Lattice-based derivative-free optimization via global surrogates
- Piecewise partially separable functions and a derivative-free algorithm for large scale nonsmooth optimization
- Primal-dual subgradient methods for convex problems
- Random gradient-free minimization of convex functions
- Randomized smoothing for stochastic optimization
- Robust identification
Cited In (21)
- Distributed convex optimization with coupling constraints over time-varying directed graphs
- An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
- Distributed subgradient method for multi-agent optimization with quantized communication
- Robust optimization of noisy blackbox problems using the mesh adaptive direct search algorithm
- Gradient-free push-sum method for strongly convex distributed optimization
- Splitting proximal with penalization schemes for additive convex hierarchical minimization problems
- Distributed subgradient-free stochastic optimization algorithm for nonsmooth convex functions over time-varying networks
- A fast dual proximal-gradient method for separable convex optimization with linear coupled constraints
- A gradient-free distributed optimization method for convex sum of nonconvex cost functions
- Strong consistency of random gradient-free algorithms for distributed optimization
- Recent theoretical advances in decentralized distributed convex optimization
- Stochastic mirror descent method for distributed multi-agent optimization
- Gradient-free method for distributed multi-agent optimization via push-sum algorithms
- Regularized dual gradient distributed method for constrained convex optimization over unbalanced directed graphs
- Harnessing Smoothness to Accelerate Distributed Optimization
- A fully distributed ADMM-based dispatch approach for virtual power plant problems
- Sparsity-promoting distributed charging control for plug-in electric vehicles over distribution networks
- Incremental gradient-free method for nonsmooth distributed optimization
- Distributed optimization methods for nonconvex problems with inequality constraints over time-varying networks
- Gradient-free distributed optimization with exact convergence
- An objective penalty function-based method for inequality constrained minimization problem
This page was built for publication: Gradient-free method for nonsmooth distributed optimization