Difference of convex algorithms for bilevel programs with applications in hyperparameter selection


DOI: 10.1007/S10107-022-01888-3
arXiv: 2102.09006
Wikidata: Q114228401 (Scholia: Q114228401)
MaRDI QID: Q6360897
FDO: Q6360897

Xiaoming Yuan, Jin Zhang, Jane J. Ye, Shangzhi Zeng

Publication date: 17 February 2021

Abstract: In this paper, we present difference of convex algorithms for solving bilevel programs in which the upper-level objective functions are differences of convex functions and the lower-level programs are fully convex. This nontrivial class of bilevel programs provides a powerful modelling framework for applications arising from hyperparameter selection in machine learning. Thanks to the full convexity of the lower-level program, its value function turns out to be convex, and hence the bilevel program can be reformulated as a single-level difference of convex program. We propose two algorithms for solving the reformulated difference of convex program and show their convergence under very mild assumptions. Finally, we conduct numerical experiments on a bilevel model of support vector machine classification.
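
The abstract describes a value-function reformulation that turns the bilevel program into a single-level difference-of-convex (DC) program, which is then handled by DC-algorithm (DCA) type iterations: linearize the concave part at the current iterate and solve the resulting convex subproblem. The sketch below shows one such generic DCA loop on a toy objective f(x) = g(x) - h(x) with g(x) = 0.5*||x - a||^2 and h(x) = ||x||_1; the choice of g and h, the data vector a, and the stopping tolerance are illustrative assumptions, and this is not the paper's two algorithms.

import numpy as np

def dca(a, x0, max_iter=100, tol=1e-8):
    """Generic DCA sketch: at iterate x_k, take a subgradient s_k of h,
    then solve the convex subproblem min_x g(x) - <s_k, x>, which for
    this toy g has the closed form x = a + s_k."""
    x = x0.astype(float)
    for _ in range(max_iter):
        s = np.sign(x)          # a subgradient of h(x) = ||x||_1
        x_new = a + s           # argmin_x 0.5*||x - a||^2 - <s, x>
        if np.linalg.norm(x_new - x) <= tol:
            x = x_new
            break
        x = x_new
    return x

if __name__ == "__main__":
    a = np.array([0.3, -2.0, 1.5])      # illustrative data vector
    x_star = dca(a, x0=np.zeros_like(a))
    print("DCA iterate:", x_star)

Each iteration decreases the DC objective, and the loop stops at a critical point, here a point x where x - a is a subgradient of the l1 norm at x; the convergence theory for the paper's bilevel setting is developed under much weaker assumptions than this closed-form toy suggests.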
This page was built for publication: Difference of convex algorithms for bilevel programs with applications in hyperparameter selection
