Scalable subspace methods for derivative-free nonlinear least-squares optimization

Publication: 6038650

DOI: 10.1007/s10107-022-01836-1
arXiv: 2102.12016
OpenAlex: W3133264924
MaRDI QID: Q6038650


Authors: Coralia Cartis, Lindon Roberts


Publication date: 2 May 2023

Published in: Mathematical Programming. Series A. Series B

Abstract: We introduce a general framework for large-scale model-based derivative-free optimization based on iterative minimization within random subspaces. We present a probabilistic worst-case complexity analysis for our method, where in particular we prove high-probability bounds on the number of iterations before a given optimality level is achieved. This framework is specialized to nonlinear least-squares problems via a model-based approach built on the Gauss-Newton method. The method achieves scalability by constructing local linear interpolation models to approximate the Jacobian, and computes new steps at each iteration in a subspace of user-determined dimension. We then describe a practical implementation of this framework, which we call DFBGN. We outline efficient techniques for selecting the interpolation points and search subspace, yielding an implementation that has a low per-iteration linear algebra cost (linear in the problem dimension) while also achieving fast objective decrease as measured by evaluations. Extensive numerical results demonstrate that DFBGN has improved scalability, yielding strong performance on large-scale nonlinear least-squares problems.
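To illustrate the core idea described in the abstract, the sketch below shows what one derivative-free subspace Gauss-Newton iteration could look like: a random p-dimensional subspace is drawn, a linear interpolation model of the reduced Jacobian is built from residual evaluations alone, and the step is computed within that subspace. This is a minimal illustration under simplifying assumptions, not the authors' DFBGN implementation; the sampling scheme, interpolation-point selection, and trust-region updates here are simplified placeholders for the techniques developed in the paper.

```python
import numpy as np

def subspace_gauss_newton(r, x0, p=5, sigma=1e-3, delta0=1.0,
                          max_iters=200, tol=1e-8, seed=0):
    """Minimal sketch of a derivative-free subspace Gauss-Newton loop
    for min_x 0.5 * ||r(x)||^2, loosely in the spirit of the paper's
    framework (NOT the authors' DFBGN code).

    r : callable mapping R^n -> R^m (residual vector)
    p : subspace dimension (user-chosen, p << n)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    delta = delta0
    rx = r(x)
    for _ in range(max_iters):
        # Draw a random p-dimensional subspace (orthonormal basis Q).
        Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
        # Linear interpolation along each subspace direction approximates
        # the reduced Jacobian J(x) @ Q column by column, using only
        # residual evaluations (no derivatives).
        JQ = np.column_stack([(r(x + sigma * Q[:, i]) - rx) / sigma
                              for i in range(p)])
        # Gauss-Newton step in the subspace: minimize ||rx + JQ s||,
        # then map the p-dimensional step back to R^n.
        s, *_ = np.linalg.lstsq(JQ, -rx, rcond=None)
        step = Q @ s
        # Crude trust-region safeguard on the full-space step
        # (a placeholder for the paper's trust-region machinery).
        norm = np.linalg.norm(step)
        if norm > delta:
            step *= delta / norm
        x_new = x + step
        r_new = r(x_new)
        if np.linalg.norm(r_new) < np.linalg.norm(rx):
            x, rx = x_new, r_new          # accept; mildly expand radius
            delta = min(2.0 * delta, 1e3)
        else:
            delta *= 0.5                  # reject; shrink radius
        if np.linalg.norm(rx) < tol or delta < 1e-12:
            break
    return x, rx

# Hypothetical usage on a toy two-dimensional residual:
r = lambda x: np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])
x_star, res = subspace_gauss_newton(r, x0=np.zeros(2), p=2)
```

The per-iteration linear algebra works only with the m-by-p reduced Jacobian rather than the full m-by-n one, which is the source of the scalability the abstract describes; the paper's actual point-selection and subspace-update strategies are what make this efficient in evaluations as well.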


Full work available at URL: https://arxiv.org/abs/2102.12016







Cited in: 13 documents





