A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm


DOI: 10.1007/s11590-021-01775-4
zbMATH Open: 1491.90182
arXiv: 2104.01314
OpenAlex: W3178689845
MaRDI QID: Q2128772
FDO: Q2128772


Authors: Quoc Tran Dinh


Publication date: 22 April 2022

Published in: Optimization Letters

Abstract: In this paper, we develop a unified convergence analysis framework for the Accelerated Smoothed GAp ReDuction algorithm (ASGARD) introduced in [20, Tran-Dinh et al., 2015]. Unlike [20], the new analysis covers three settings in a single algorithm: general convexity, strong convexity, and strong convexity combined with smoothness. Moreover, we establish convergence guarantees on three criteria: (i) the gap function, (ii) the primal objective residual, and (iii) the dual objective residual. Our convergence rates are optimal (up to a constant factor) in all cases. While the convergence rate on the primal objective residual for the general convex case was already established in [20], we prove additional convergence rates on the gap function and the dual objective residual. The analysis for the last two cases is completely new. Our results provide a complete picture of the convergence guarantees of ASGARD. Finally, we present four numerical experiments on a representative optimization model to verify our algorithm and compare it with Nesterov's smoothing technique.
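The three criteria mentioned in the abstract can be made concrete on the composite convex template that ASGARD-type methods address. The following is only an illustrative sketch with assumed notation (the functions f and g, the linear map A, the smoothing centers \dot{x}, \dot{y}, and the parameters \gamma, \beta are placeholders and may differ from the paper's own definitions):

\[
  F^{\star} = \min_{x}\big\{ F(x) := f(x) + g(Ax) \big\},
  \qquad
  \mathcal{L}(x,y) := f(x) + \langle Ax, y\rangle - g^{*}(y),
\]
\[
  \mathcal{G}_{\gamma\beta}(w) := \max_{\hat{y}}\Big\{ \mathcal{L}(x,\hat{y}) - \tfrac{\beta}{2}\|\hat{y}-\dot{y}\|^{2} \Big\}
  \;-\; \min_{\hat{x}}\Big\{ \mathcal{L}(\hat{x},y) + \tfrac{\gamma}{2}\|\hat{x}-\dot{x}\|^{2} \Big\},
  \qquad w := (x,y).
\]

Under this reading, criterion (i) bounds the smoothed gap \(\mathcal{G}_{\gamma\beta}\), criterion (ii) bounds the primal residual \(F(x^{k}) - F^{\star}\), and criterion (iii) bounds the dual residual \(D^{\star} - D(y^{k})\), where \(D(y) := \min_{x}\mathcal{L}(x,y)\) is the dual function.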


Full work available at URL: https://arxiv.org/abs/2104.01314








