Distributed Optimization with Global Constraints Using Noisy Measurements
From MaRDI portal
Publication:6370296
arXiv: 2106.07703
MaRDI QID: Q6370296
Authors: Van Sy Mai, Richard J. La, Tao Zhang, Abdella Battou
Publication date: 14 June 2021
Abstract: We propose a new distributed optimization algorithm for solving a class of constrained optimization problems in which (a) the objective function is separable (i.e., the sum of local objective functions of agents), (b) the optimization variables of distributed agents, which are subject to nontrivial local constraints, are coupled by global constraints, and (c) only noisy observations are available to estimate (the gradients of) local objective functions. In many practical scenarios, agents may not be willing to share their optimization variables with others. For this reason, we propose a distributed algorithm that does not require the agents to share their optimization variables with each other; instead, each agent maintains a local estimate of the global constraint functions and shares that estimate only with its neighbors. These local estimates of the constraint functions are updated using a consensus-type algorithm, while the local optimization variables of each agent are updated using a first-order method based on noisy gradient estimates. We prove that, when the agents adopt the proposed algorithm, their optimization variables converge with probability 1 to an optimal point of an approximate problem obtained via the penalty method.
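The abstract's ingredients can be illustrated on a toy problem. The sketch below is not the authors' algorithm; it is a hedged, simplified analogue under several assumptions of my own: each of n agents minimizes a quadratic f_i(x_i) = (x_i - a_i)^2 subject to a box constraint and a single coupling constraint sum_i x_i <= c, agents exchange only their constraint estimates y_i (never x_i) via a doubly stochastic mixing matrix W (dynamic average consensus), gradients are corrupted by additive Gaussian noise, and the coupling constraint is handled with a quadratic penalty. The function name and all parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

def distributed_penalty_sketch(a, c, W, steps=20000, alpha=0.002,
                               rho=50.0, noise=0.01, seed=0):
    """Toy analogue of consensus-based distributed optimization with a
    global constraint, noisy gradients, and a penalty method.

    n agents minimize sum_i (x_i - a_i)^2 subject to x_i in [0, 10]
    (local constraint) and sum_i x_i <= c (global constraint).
    Agent i shares only y_i, its running estimate of the average
    constraint value (1/n) * sum_j (x_j - c/n); variables x_i are
    never exchanged.
    """
    rng = np.random.default_rng(seed)
    n = len(a)
    x = np.zeros(n)
    g = x - c / n          # local constraint terms g_i(x_i) = x_i - c/n
    y = g.copy()           # local estimates of the average constraint value

    for _ in range(steps):
        g_new = x - c / n
        # Dynamic average consensus: mix neighbors' estimates via W,
        # then add the change in the local constraint term.
        y = W @ y + (g_new - g)
        g = g_new
        # Noisy gradient of the local objective f_i(x_i) = (x_i - a_i)^2.
        grad_f = 2.0 * (x - a) + noise * rng.standard_normal(n)
        # Gradient of the penalty rho * max(0, sum_j x_j - c)^2, with the
        # global sum replaced by the agent's local estimate n * y_i.
        grad_pen = 2.0 * rho * np.maximum(0.0, n * y)
        # First-order step, then projection onto the local box constraint.
        x = np.clip(x - alpha * (grad_f + grad_pen), 0.0, 10.0)
    return x
```

With a = (3, 4, 5) and c = 6, the unconstrained optimum sums to 12, so the penalty pushes the iterates toward the projection onto the coupling constraint, roughly (1, 2, 3), up to the usual small penalty-method bias: this sketch converges to an optimum of the penalized problem, not of the original constrained problem, mirroring the approximation discussed in the abstract.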