Private and Robust Distributed Nonconvex Optimization via Polynomial Approximation
Publication: 6505046
DOI: 10.1109/TCNS.2024.3354875 · arXiv: 2101.06127 · Wikidata: Q129773197 · Scholia: Q129773197 · MaRDI QID: Q6505046 · FDO: Q6505046
Authors: Zhiyu He, Jianping He, Cailian Chen, Xinping Guan
Abstract: Prior work exploits polynomial approximation to solve distributed nonconvex optimization problems with univariate objectives. This idea enables arbitrarily precise global optimization without requiring local gradient evaluations at every iteration. Nonetheless, a gap remains between existing theoretical guarantees and diverse practical requirements, e.g., privacy preservation and robustness to network imperfections. To fill this gap while retaining the above strengths, we propose a Private and Robust Chebyshev-Proxy-based distributed Optimization Algorithm (PR-CPOA). Specifically, to ensure both the accuracy of solutions and the privacy of local objectives, we design a new privacy-preserving mechanism. This mechanism leverages the randomness in blockwise insertions of perturbed vector states and hence provides an improved privacy guarantee in the sense of (ε, δ)-data-privacy. Furthermore, to gain robustness against various network imperfections, we use the push-sum consensus protocol as a backbone, discuss its specific enhancements, and evaluate the performance of the proposed algorithm accordingly. Thanks to the purely consensus-type iterations, we avoid the privacy-accuracy trade-off and the difficulty of selecting appropriate step sizes in different settings. We provide rigorous analyses of accuracy, privacy, and complexity. We show that the advantages brought by polynomial approximation are preserved even when all the above requirements are imposed.
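The pipeline the abstract describes — each node fits a Chebyshev proxy to its local univariate objective, push-sum consensus averages the coefficient vectors over a directed network, and every node then minimizes the averaged proxy in closed form — can be sketched as below. This is an illustrative reconstruction, not the paper's PR-CPOA implementation: the privacy mechanism (blockwise insertion of perturbed states) is omitted, and the example objectives, the degree, the interval, and the directed-ring topology are all assumptions made for the demo.

```python
import numpy as np

# Stage 1: each node fits a Chebyshev proxy to its local (possibly
# nonconvex) univariate objective on a common interval.  The objectives,
# degree, and interval below are illustrative choices, not from the paper.
DOMAIN = [-2.0, 2.0]
DEG = 40
local_objectives = [
    lambda x: np.sin(2 * x),
    lambda x: 0.5 * x ** 2,
    lambda x: np.cos(x),
]
coefs = np.array([
    np.polynomial.Chebyshev.interpolate(f, DEG, domain=DOMAIN).coef
    for f in local_objectives
])


def push_sum_average(vectors, out_neighbors, iters=200):
    """Stage 2: average the rows of `vectors` over a directed graph.

    Plain push-sum: every node splits its value/weight pair evenly among
    its out-neighbors and itself; the ratio x_i / w_i at every node
    converges to the network-wide average.
    """
    x = np.array(vectors, dtype=float)
    n = len(x)
    w = np.ones(n)
    for _ in range(iters):
        nx, nw = np.zeros_like(x), np.zeros(n)
        for i in range(n):
            dests = out_neighbors[i] + [i]  # out-neighbors plus self-loop
            for j in dests:
                nx[j] += x[i] / len(dests)
                nw[j] += w[i] / len(dests)
        x, w = nx, nw
    return x / w[:, None]


def global_min_of_series(coef, domain):
    """Stage 3: globally minimize a Chebyshev series on `domain`.

    Candidates are the real roots of the derivative series (the critical
    points) plus the two endpoints; no step size is involved.
    """
    p = np.polynomial.Chebyshev(coef, domain=domain)
    r = p.deriv().roots()
    r = r[np.isreal(r)].real
    a, b = domain
    cands = np.concatenate([r[(r >= a) & (r <= b)], domain])
    vals = p(cands)
    i = np.argmin(vals)
    return cands[i], vals[i]


# Directed ring 0 -> 1 -> 2 -> 0 (an assumed topology).  After push-sum,
# every node holds approximately the average coefficient vector, i.e. a
# proxy of the global objective, and minimizes it locally.
avg = push_sum_average(coefs, [[1], [2], [0]])
x_star, f_star = global_min_of_series(avg[0], DOMAIN)
```

Because averaging Chebyshev coefficients is the same as interpolating the averaged objective, every node ends up minimizing the same global proxy — which is why the consensus-only iterations need no step-size tuning. The paper's privacy mechanism would additionally perturb and blockwise-insert the transmitted coefficient states before exchange.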
This page was built for publication: Private and Robust Distributed Nonconvex Optimization via Polynomial Approximation