ZO-JADE: Zeroth-order Curvature-Aware Multi-Agent Convex Optimization

From MaRDI portal
Publication:6429428

DOI: 10.1109/LCSYS.2023.3281745
arXiv: 2303.07450
MaRDI QID: Q6429428
FDO: Q6429428


Authors: Alessio Maritan, Luca Schenato


Publication date: 13 March 2023

Abstract: In this work we address the problem of convex optimization in a multi-agent setting, where the objective is to minimize the mean of local cost functions whose derivatives are not available (e.g. black-box models). Moreover, agents can only communicate with local neighbors according to a connected network topology. Zeroth-order (ZO) optimization has recently gained increasing attention in federated learning and multi-agent scenarios; it exploits finite-difference approximations of the gradient that require from 2 (directional gradient) to 2d (central-difference full gradient) evaluations of the cost functions, where d is the dimension of the problem. The contribution of this work is to extend ZO distributed optimization by estimating the curvature of the local cost functions via finite-difference approximations. In particular, we propose a novel algorithm, named ZO-JADE, that by adding just one extra point, i.e. 2d+1 evaluations in total, simultaneously estimates the gradient and the diagonal of the local Hessian; these estimates are then combined via average tracking consensus to obtain an approximate Jacobi descent. Guarantees of semi-global exponential stability are established via separation of time scales. Extensive numerical experiments on real-world data confirm the efficiency and superiority of our algorithm with respect to several other distributed zeroth-order methods in the literature that rely on gradient estimates only.
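The 2d+1 evaluation scheme described in the abstract can be illustrated with a short sketch: central differences along each coordinate give the gradient, and reusing the same points together with the one extra evaluation f(x) gives second differences for the Hessian diagonal, which then scales a Jacobi (diagonal Newton) step. This is a hypothetical single-agent illustration of the finite-difference idea only, not the authors' ZO-JADE implementation, which additionally runs average tracking consensus across agents; the function names and the step size h are assumptions.

```python
import numpy as np

def zo_grad_and_hess_diag(f, x, h=1e-3):
    """Estimate the gradient and the Hessian diagonal of f at x using
    2d+1 function evaluations: f(x) plus f(x +/- h*e_i) for each axis i.
    Hypothetical sketch of the finite-difference scheme in the abstract."""
    d = x.size
    fx = f(x)                      # the single extra evaluation point
    grad = np.empty(d)
    hess_diag = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        f_plus, f_minus = f(x + e), f(x - e)
        grad[i] = (f_plus - f_minus) / (2 * h)              # central difference
        hess_diag[i] = (f_plus - 2 * fx + f_minus) / h**2   # second difference
    return grad, hess_diag

def jacobi_step(x, grad, hess_diag, eps=1e-8):
    """One Jacobi-descent step: scale each gradient component by the
    inverse of the corresponding (positive) curvature estimate."""
    return x - grad / np.maximum(hess_diag, eps)
```

On a simple quadratic such as f(x) = sum(x_i^2), the estimates are accurate to floating-point noise and a single Jacobi step jumps essentially to the minimizer, which is what motivates paying one extra evaluation for curvature.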













