Efficiency of higher-order algorithms for minimizing composite functions


DOI: 10.1007/s10589-023-00533-9
arXiv: 2203.13367
MaRDI QID: Q6155068


Authors: Yassine Nabou, Ion Necoara


Publication date: 16 February 2024

Published in: Computational Optimization and Applications

Abstract: Composite minimization involves a collection of functions that are aggregated in a nonsmooth manner. It covers, as particular cases, optimization problems with functional constraints, minimization of max-type functions, and simple composite minimization problems where the objective function has a nonsmooth component. We design a higher-order majorization algorithmic framework for fully composite problems (possibly nonconvex). Our framework replaces each component with a higher-order surrogate such that the corresponding error function has a higher-order Lipschitz continuous derivative. We present convergence guarantees for our method for composite optimization problems with (non)convex and (non)smooth objective functions. In particular, we prove stationary-point convergence guarantees for general nonconvex (possibly nonsmooth) problems, and under the Kurdyka-Łojasiewicz (KL) property of the objective function we derive improved rates depending on the KL parameter. For convex (possibly nonsmooth) problems we also provide sublinear convergence rates.
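In its simplest first-order form (surrogate order p = 1), the majorization idea in the abstract reduces to a proximal-gradient step: the smooth component is replaced by a quadratic upper surrogate, and the surrogate-plus-nonsmooth-part subproblem is minimized exactly. The sketch below is illustrative only, not the paper's algorithm; all function names and the problem instance are hypothetical.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def composite_min(A, b, lam, n_iters=500):
    # Minimize F(x) = 0.5 * ||Ax - b||^2 + lam * ||x||_1 by repeatedly
    # minimizing a surrogate: the quadratic majorizer of the smooth part
    # (using the Lipschitz constant L of its gradient) plus the nonsmooth part.
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = prox_l1(x - grad / L, lam / L)  # exact minimizer of the surrogate
    return x

def objective(A, b, lam, x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

Because each iterate minimizes an upper model of F that is tight at the current point, the objective is monotonically nonincreasing; for convex problems this scheme attains the sublinear rate the abstract states for the first-order case.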


Full work available at URL: https://arxiv.org/abs/2203.13367






Cited In (1)





