Efficiency of higher-order algorithms for minimizing composite functions

From MaRDI portal
Publication:6155068




Abstract: Composite minimization involves a collection of functions that are aggregated in a nonsmooth manner. As particular cases, it covers optimization problems with functional constraints, minimization of max-type functions, and simple composite minimization problems in which the objective function has a nonsmooth component. We design a higher-order majorization algorithmic framework for fully composite problems (possibly nonconvex). Our framework replaces each component with a higher-order surrogate such that the corresponding error function has a higher-order Lipschitz continuous derivative. We present convergence guarantees for our method on composite optimization problems with (non)convex and (non)smooth objective functions. In particular, we prove convergence to stationary points for general nonconvex (possibly nonsmooth) problems, and under the Kurdyka-Łojasiewicz (KL) property of the objective function we derive improved rates depending on the KL parameter. For convex (possibly nonsmooth) problems we also provide sublinear convergence rates.
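The surrogate idea described in the abstract can be illustrated on the simplest composite case, min f(x) + h(x): the smooth part f is replaced by its second-order Taylor model plus a cubic regularization term (a valid majorant when f'' is M-Lipschitz), while the nonsmooth part h is kept exactly, and each iteration minimizes this surrogate. The sketch below is illustrative only; the test problem cos(x) + 0.1|x|, the constant M, and the grid-search subproblem solver are assumptions for the demo, not the paper's method:

```python
import math

def f(x):          # smooth component; f''(x) = -cos(x) is 1-Lipschitz
    return math.cos(x)

def h(x):          # nonsmooth component, kept exactly in the surrogate
    return 0.1 * abs(x)

M = 1.0            # Lipschitz constant of f'' (|f'''(x)| = |sin(x)| <= 1)

def surrogate(x, xk):
    # second-order Taylor model of f at xk, plus cubic regularization, plus h
    d = x - xk
    return (math.cos(xk) - math.sin(xk) * d - 0.5 * math.cos(xk) * d * d
            + (M / 6.0) * abs(d) ** 3 + h(x))

def mm_step(xk, radius=1.0, grid=2001):
    # minimize the surrogate by dense grid search (illustration only);
    # the grid includes xk itself, so the objective never increases
    pts = [xk - radius + 2 * radius * i / (grid - 1) for i in range(grid)]
    return min(pts, key=lambda x: surrogate(x, xk))

x = 0.5
for _ in range(50):
    x = mm_step(x)

print(x, f(x) + h(x))
```

Because the cubic term makes the surrogate a true majorant of f near xk, each step decreases the composite objective, which is the mechanism behind the stationary-point guarantees the abstract refers to. Here the iterates approach the stationary point where sin(x) = 0.1, i.e. x ≈ pi - arcsin(0.1).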


