Incremental Quasi-Newton Algorithms for Solving Nonconvex, Nonsmooth, Finite-Sum Optimization Problems

From MaRDI portal
Publication: 6405530




Abstract: Algorithms for solving nonconvex, nonsmooth, finite-sum optimization problems are proposed and tested, in particular in the context of an optimization problem formulation arising in semi-supervised machine learning. The common feature of all of the algorithms is that they employ an incremental quasi-Newton (IQN) strategy, specifically an incremental BFGS (IBFGS) strategy. One applies an IBFGS strategy to the problem directly, whereas the others apply an IBFGS strategy to a difference-of-convex reformulation, a smoothed approximation, or a (strongly) convex local approximation. Experiments show that all of the IBFGS approaches fare well in practice, and all outperform a state-of-the-art bundle method.
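The incremental BFGS strategy the abstract refers to can be illustrated on a smooth finite-sum instance: per-component BFGS matrices and gradient copies are maintained, and one component is refreshed per iteration. The sketch below is a minimal illustration under these assumptions (cyclic component order, fixed step size, quadratic components), not the paper's exact algorithm, which also handles nonsmoothness via reformulations and approximations.

```python
import numpy as np

def incremental_bfgs(grads, dim, x0, n_iter=600, alpha=0.5):
    """Minimal incremental BFGS sketch for f(x) = (1/n) sum_i f_i(x).

    grads[i] returns the gradient of component f_i. A per-component
    BFGS matrix B_i, iterate copy z_i, and stored gradient g_i are
    kept; each iteration refreshes one component (cyclic order) and
    takes a step using the averaged curvature and stored gradients.
    Illustrative assumptions: fixed step size, cyclic schedule.
    """
    n = len(grads)
    x = x0.copy()
    z = [x0.copy() for _ in range(n)]        # per-component iterate copies
    g = [grads[i](x0) for i in range(n)]     # stored gradients at z_i
    B = [np.eye(dim) for _ in range(n)]      # per-component BFGS matrices
    for t in range(n_iter):
        i = t % n                            # cyclic component selection
        s = x - z[i]
        gi_new = grads[i](x)
        y = gi_new - g[i]
        sy = s @ y
        if sy > 1e-12:                       # skip update if curvature test fails
            Bs = B[i] @ s                    # standard BFGS matrix update
            B[i] = B[i] + np.outer(y, y) / sy - np.outer(Bs, Bs) / (s @ Bs)
        z[i], g[i] = x.copy(), gi_new        # refresh component i's memory
        B_avg = sum(B) / n                   # aggregate curvature estimate
        g_avg = sum(g) / n                   # aggregate (partially stale) gradient
        x = x - alpha * np.linalg.solve(B_avg, g_avg)
    return x
```

Because only one component gradient is recomputed per iteration, the per-iteration cost is independent of n, which is the motivation for incremental schemes on finite sums; the stored gradients for the other components are reused even though they are slightly stale.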











