Convergence analysis of the fast subspace descent method for convex optimization problems


DOI: 10.1090/MCOM/3526
zbMATH Open: 1442.65426
arXiv: 1810.04116
OpenAlex: W3005875771
MaRDI QID: Q5113667

Long Chen, X. Hu, Steven Wise

Publication date: 15 June 2020

Published in: Mathematics of Computation

Abstract: The full approximation storage (FAS) scheme is a widely used multigrid method for nonlinear problems. In this paper, a new framework for designing and analyzing FAS-like schemes for convex optimization problems is developed. The new method, the Fast Subspace Descent (FASD) scheme, generalizes the classical FAS scheme and can be recast as an inexact version of nonlinear multigrid methods based on space decomposition and subspace correction. The local problem in each subspace can be simplified to a linear one, and a single gradient descent iteration (with an appropriate step size) is enough to ensure global linear (geometric) convergence of FASD.


Full work available at URL: https://arxiv.org/abs/1810.04116
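
The abstract only sketches the core iteration: decompose the space into subspaces and take one gradient descent step per subspace with a suitable step size. Below is a minimal, hypothetical Python sketch of a successive subspace correction sweep in that spirit; the two-block coordinate decomposition, the quadratic model problem, and all function names are illustrative assumptions, not the paper's multilevel FASD construction.

```python
# Hypothetical sketch: successive subspace correction with one gradient
# descent step per subspace, applied to a simple convex quadratic.
# Everything here (names, decomposition, model problem) is an assumption
# for illustration, not the authors' FASD algorithm.
import numpy as np


def subspace_descent(grad, x0, subspaces, step_sizes, num_sweeps=50):
    """Sweep over subspaces, taking one damped gradient step in each."""
    x = x0.copy()
    for _ in range(num_sweeps):
        for P, tau in zip(subspaces, step_sizes):
            # Restrict the current gradient to the subspace spanned by the
            # columns of P, take one gradient step there, prolongate back.
            g = P.T @ grad(x)       # restricted (local) gradient
            x = x - tau * (P @ g)   # subspace correction of the global iterate
    return x


if __name__ == "__main__":
    # Model convex problem: f(x) = 0.5 x^T A x - b^T x with SPD A (assumption).
    n = 8
    A = np.diag(np.arange(1.0, n + 1))
    b = np.ones(n)

    def grad(x):
        return A @ x - b

    # Two-block coordinate decomposition of R^n (illustrative only).
    P1 = np.eye(n)[:, : n // 2]
    P2 = np.eye(n)[:, n // 2:]
    # Step size 1/L_j per block, with L_j the largest eigenvalue of the
    # restricted block -- a standard safe choice for gradient descent.
    taus = [1.0 / np.max(np.diag(P.T @ A @ P)) for P in (P1, P2)]

    x = subspace_descent(grad, np.zeros(n), [P1, P2], taus, num_sweeps=200)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

Running the demo drives the residual to (near) zero, illustrating the kind of geometric convergence the abstract claims for the full FASD scheme under its stated assumptions.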




Cited in: 9 documents
