Hierarchical Convex Optimization by the Hybrid Steepest Descent Method with Proximal Splitting Operators -- Enhancements of SVM and Lasso


DOI: 10.1007/978-3-030-25939-6_16
arXiv: 2206.15019
MaRDI QID: Q6403588


Authors: Isao Yamada, Masao Yamagishi


Publication date: 30 June 2022

Abstract: Breakthrough ideas in modern proximal splitting methodologies allow us to express the set of all minimizers of a superposition of multiple nonsmooth convex functions as the fixed point set of computable nonexpansive operators. In this paper, we present practical algorithmic strategies for hierarchical convex optimization problems, which require the further strategic selection of a most desirable vector from the solution set of a standard convex optimization problem. The proposed algorithms are established by applying the hybrid steepest descent method to special nonexpansive operators designed through the art of proximal splitting. We also present applications of the proposed strategies to certain unexplored hierarchical enhancements of the support vector machine and the Lasso estimator.
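The core iteration behind the hybrid steepest descent method described in the abstract is x_{n+1} = T(x_n) - lambda_{n+1} * grad Phi(T(x_n)), where T is a nonexpansive operator whose fixed point set Fix(T) is the solution set of the lower-level problem, Phi is the upper-level smooth convex objective, and (lambda_n) is a slowly diminishing step-size sequence. The following is a minimal numerical sketch of that iteration on a hypothetical toy instance (not the paper's SVM/Lasso applications): T is chosen as the metric projection onto a hyperplane, so Fix(T) is the hyperplane, and Phi(x) = 0.5*||x - c||^2, so the hierarchical solution is the projection of c onto the hyperplane, which is available in closed form for comparison.

```python
import numpy as np

# Toy hierarchical problem (illustrative assumption, not from the paper):
#   minimize Phi(x) = 0.5 * ||x - c||^2  over  Fix(T),
# where T is the projection onto the hyperplane {x : <a, x> = b}.
# Projections are nonexpansive and Fix(T) is exactly the hyperplane,
# so the hybrid steepest descent iterates should approach the
# closed-form projection of c onto the hyperplane.

a = np.array([1.0, 2.0, 2.0])
b = 3.0
c = np.array([5.0, -1.0, 2.0])

def T(x):
    # metric projection onto the hyperplane (a nonexpansive operator)
    return x - (a @ x - b) / (a @ a) * a

def grad_phi(x):
    # gradient of Phi(x) = 0.5 * ||x - c||^2
    return x - c

# Hybrid steepest descent: x_{n+1} = T(x_n) - lambda_n * grad_phi(T(x_n)),
# with diminishing steps lambda_n = 1/n (sum diverges, lambda_n -> 0).
x = np.zeros(3)
for n in range(1, 20001):
    y = T(x)
    x = y - (1.0 / n) * grad_phi(y)

# Closed-form hierarchical solution: projection of c onto the hyperplane.
x_star = c - (a @ c - b) / (a @ a) * a

print("HSDM iterate :", np.round(x, 4))
print("Closed form  :", np.round(x_star, 4))
```

With these diminishing step sizes the iterate matches the closed-form solution to a few decimal places; the paper's contribution is constructing such operators T for nonsmooth lower-level problems via proximal splitting, where no closed form exists.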
