Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning

From MaRDI portal (Publication Q2073715)

DOI: 10.1214/21-AOS2090
zbMATH Open: 1486.62212
arXiv: 2112.09191
OpenAlex: W4200303272
MaRDI QID: Q2073715


Authors: Yiyuan She, Zhifeng Wang, Jiuwu Jin


Publication date: 7 February 2022

Published in: The Annals of Statistics

Abstract: Modern statistical applications often involve minimizing an objective function that may be nonsmooth and/or nonconvex. This paper focuses on a broad Bregman-surrogate algorithm framework that includes the local linear approximation, mirror descent, iterative thresholding, DC programming and many others as particular instances. The recharacterization via generalized Bregman functions enables us to construct suitable error measures and establish global convergence rates for nonconvex and nonsmooth objectives in possibly high dimensions. For sparse learning problems with a composite objective, under some regularity conditions, the obtained estimators as the surrogate's fixed points, though not necessarily local minimizers, enjoy provable statistical guarantees, and the sequence of iterates can be shown to approach the statistical truth within the desired accuracy geometrically fast. The paper also studies how to design adaptive momentum-based accelerations without assuming convexity or smoothness by carefully controlling stepsize and relaxation parameters.
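To make the framework concrete, the following is a minimal sketch of iterative soft-thresholding (ISTA) for the lasso, one of the particular instances the abstract names: each update exactly minimizes a quadratic Bregman surrogate of the composite objective around the current iterate. This is an illustrative implementation, not the paper's code; the function names, the fixed stepsize rule, and the lasso choice of penalty are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    # Componentwise soft-thresholding: the proximal map of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, step=None, n_iter=200):
    """Iterative soft-thresholding for the lasso objective
        0.5 * ||y - X b||^2 + lam * ||b||_1.
    Each iteration minimizes a surrogate that replaces the loss by its
    linearization plus a (1/(2*step)) * ||b - b_old||^2 Bregman term;
    the minimizer is a gradient step followed by soft-thresholding."""
    n, p = X.shape
    if step is None:
        # step = 1/L with L = ||X||_2^2 (Lipschitz constant of the
        # gradient of the quadratic loss) guarantees surrogate descent.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the smooth loss
        b = soft_threshold(b - step * grad, step * lam)
    return b
```

A fixed point of this map is a stationary point of the composite objective, though, as the abstract notes, not necessarily a local minimizer; the paper's analysis covers exactly such fixed points.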


Full work available at URL: https://arxiv.org/abs/2112.09191




Cited in: 2 documents


