Asynchronous gossip-based gradient-free method for multiagent optimization
Publication: 1724545
DOI: 10.1155/2014/618641
zbMath: 1474.90429
OpenAlex: W1984880322
Wikidata: Q59040022 (Scholia: Q59040022)
MaRDI QID: Q1724545
Publication date: 14 February 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2014/618641
Mathematics Subject Classification:
- Multi-objective and goal programming (90C29)
- Approximation methods and heuristics in mathematical programming (90C59)
Cites Work
- Unnamed Item
- Distributed average consensus via gossip algorithm with real-valued and quantized data for \(0<q<1\)
- Distributed stochastic subgradient projection algorithms for convex optimization
- Distributed dual averaging method for multi-agent optimization with quantized communication
- Random gradient-free minimization of convex functions
- Fast linear iterations for distributed averaging
- Asynchronous Gossip-Based Random Projection Algorithms Over Networks
- Introduction to Derivative-Free Optimization
- Distributed Subgradient Methods for Multi-Agent Optimization
- Constrained Consensus and Optimization in Multi-Agent Networks
- Gradient‐free method for distributed multi‐agent optimization via push‐sum algorithms
- Consensus Problems in Networks of Agents With Switching Topology and Time-Delays
- Consensus seeking in multiagent systems under dynamically changing interaction topologies
- Asynchronous Broadcast-Based Convex Optimization Over a Network
- Gossip Algorithms for Convex Consensus Optimization Over Networks
- On Distributed Convex Optimization Under Inequality and Equality Constraints
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling
- Distributed primal–dual stochastic subgradient algorithms for multi‐agent optimization under inequality constraints
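The cited works combine two ingredients reflected in the publication's title: gossip-based averaging over a network and random gradient-free (zeroth-order) minimization of convex functions. As a rough illustration of how such an asynchronous update can look, the sketch below assumes quadratic local objectives, a ring topology, a diminishing step size, and a two-point random-direction gradient estimator; all of these choices are assumptions for the example and are not taken from the publication itself.

```python
import numpy as np

# Illustrative sketch only: an asynchronous, pairwise-gossip, gradient-free
# update for minimizing f(x) = sum_i f_i(x), where agent i can only *evaluate*
# its private f_i.  The quadratic objectives, ring topology, step-size rule,
# and two-point gradient estimator are assumptions made for this example.

rng = np.random.default_rng(0)
n_agents, dim = 5, 3
targets = rng.normal(size=(n_agents, dim))      # agent i's private data

def local_f(i, x):
    # Local objective of agent i (illustrative quadratic).
    return 0.5 * np.sum((x - targets[i]) ** 2)

def zo_grad(i, x, mu=1e-3):
    # Two-point random-direction estimate of grad f_i, built from
    # function values only (no gradient oracle).
    u = rng.normal(size=dim)
    return (local_f(i, x + mu * u) - local_f(i, x)) / mu * u

# Ring network: agent i may gossip with its two neighbours.
neighbors = {i: [(i - 1) % n_agents, (i + 1) % n_agents] for i in range(n_agents)}

x = rng.normal(size=(n_agents, dim))            # local iterates

for k in range(1, 20001):
    i = int(rng.integers(n_agents))             # agent whose local clock ticks
    j = int(rng.choice(neighbors[i]))           # random neighbour it contacts
    avg = 0.5 * (x[i] + x[j])                   # pairwise gossip average
    step = 1.0 / k ** 0.75                      # diminishing step size
    x[i] = avg - step * zo_grad(i, avg)         # gradient-free descent steps
    x[j] = avg - step * zo_grad(j, avg)

x_star = targets.mean(axis=0)                   # minimizer of sum_i f_i
print("max deviation from optimum:", float(np.abs(x - x_star).max()))
```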