Distributed Personalized Gradient Tracking With Convex Parametric Models
From MaRDI portal
Publication:6137545
DOI: 10.1109/TAC.2022.3147007
arXiv: 2008.04363
MaRDI QID: Q6137545
Ivano Notarnicola, Giuseppe Notarstefano, Andrea Simonetto, Francesco Farina
Publication date: 4 September 2023
Published in: IEEE Transactions on Automatic Control
Abstract: We present a distributed optimization algorithm for solving online personalized optimization problems over a network of computing and communicating nodes, each of which is linked to a specific user. The local objective functions are assumed to have a composite structure, consisting of a known time-varying (engineering) part and an unknown (user-specific) part. The unknown part is assumed to have a known parametric (e.g., quadratic) structure a priori, whose parameters are learned as the algorithm evolves. The algorithm is composed of two intertwined components: (i) a dynamic gradient tracking scheme for finding local solution estimates and (ii) a recursive least squares scheme for estimating the unknown parameters from the user's noisy feedback on the local solution estimates. The algorithm is shown to exhibit a bounded regret under suitable assumptions. Finally, a numerical example corroborates the theoretical analysis.
Full work available at URL: https://arxiv.org/abs/2008.04363
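The interplay between the two components described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm as stated (which handles time-varying costs and function-value feedback); it is a simplified, static, scalar instance under assumed costs: each node holds a known quadratic "engineering" term 0.5·a_i·(x − b_i)² plus an unknown user term 0.5·θ_i·x², and the user is assumed to return noisy gradient feedback θ_i·x + noise, from which a per-node scalar recursive least squares (RLS) update estimates θ_i while a standard gradient tracking iteration drives the nodes to consensus on the minimizer. All constants (network size, step size, noise level) are illustrative choices.

```python
import numpy as np

def gradient_tracking_with_rls(n_nodes=5, n_iters=2000, alpha=0.02, seed=0):
    """Toy static instance: gradient tracking + per-node scalar RLS."""
    rng = np.random.default_rng(seed)

    # Doubly stochastic mixing matrix for a ring graph (uniform weights).
    W = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        W[i, (i - 1) % n_nodes] = 1 / 3
        W[i, (i + 1) % n_nodes] = 1 / 3
        W[i, i] = 1 / 3

    # Known (engineering) part: f_i(x) = 0.5 * a_i * (x - b_i)^2.
    a = rng.uniform(1.0, 2.0, n_nodes)
    b = rng.uniform(0.5, 1.5, n_nodes)
    # Unknown user part: u_i(x) = 0.5 * theta_i * x^2, theta_i to be learned.
    theta_true = rng.uniform(0.5, 1.5, n_nodes)

    x = np.zeros(n_nodes)          # local solution estimates
    theta_hat = np.ones(n_nodes)   # RLS parameter estimates
    P = np.full(n_nodes, 10.0)     # RLS "covariances" (scalar case)

    def grad(xv, th):
        # Gradient of f_i + estimated u_i at each node.
        return a * (xv - b) + th * xv

    y = grad(x, theta_hat)         # tracker initialized at local gradients
    g_old = y.copy()

    for _ in range(n_iters):
        # (i) Gradient tracking step: consensus + descent along tracker y.
        x_new = W @ x - alpha * y

        # Noisy user feedback on the unknown gradient theta_i * x
        # (a simplification of the paper's function-value feedback).
        phi = x_new
        z = theta_true * phi + 0.01 * rng.standard_normal(n_nodes)

        # (ii) Scalar recursive least squares update, node by node.
        k = P * phi / (1.0 + phi * P * phi)
        theta_hat = theta_hat + k * (z - theta_hat * phi)
        P = P - k * phi * P

        # Tracker update with the newly estimated parameters.
        g_new = grad(x_new, theta_hat)
        y = W @ y + g_new - g_old
        x, g_old = x_new, g_new

    # Closed-form optimum of sum_i (f_i + u_i) for comparison.
    x_star = a @ b / (a + theta_true).sum()
    return x, x_star
```

In this toy run the local estimates reach consensus near the true personalized optimum `x_star` even though each θ_i starts at an incorrect guess, which is the qualitative behavior the bounded-regret result formalizes.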
Recommendations
- Distributed stochastic gradient tracking methods
- Gradient-tracking based differentially private distributed optimization with enhanced optimization accuracy
- Distributed Gradient Tracking for Unbalanced Optimization With Different Constraint Sets
- Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization
- Distributed gradient tracking methods with finite data rates
- Triggered gradient tracking for asynchronous distributed optimization
- Distributed consensus-based multi-agent convex optimization via gradient tracking technique
- Distributed Optimization Based on Gradient Tracking Revisited: Enhancing Convergence Rate via Surrogation
- Tracking-ADMM for distributed constraint-coupled optimization
- Gradient-free distributed optimization with exact convergence
This page was built for publication: Distributed Personalized Gradient Tracking With Convex Parametric Models