Additive Schwarz methods for convex optimization with backtracking
From MaRDI portal
Abstract: This paper presents a novel backtracking strategy for additive Schwarz methods for general convex optimization problems as an acceleration scheme. The proposed backtracking strategy is independent of local solvers, so it can be applied to any algorithm that fits the abstract framework of additive Schwarz methods. By allowing the step size to increase and decrease adaptively along the iterations, the strategy greatly improves the convergence rate of an algorithm, and this improved convergence rate is proven rigorously. In addition, combining the proposed backtracking strategy with a momentum acceleration technique, we propose a further accelerated additive Schwarz method. Numerical results for various convex optimization problems that support the theory are presented.
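The core idea of an adaptive backtracking step size — shrink the step when a sufficient-decrease test fails, and try a larger step after each successful iteration — can be illustrated on a plain gradient method. This is a generic Armijo-style sketch, not the paper's Schwarz-specific algorithm; all names and parameter values (`inc`, `dec`, the 0.5 decrease coefficient) are illustrative assumptions.

```python
import numpy as np

def backtracking_gradient(f, grad, x0, t0=1.0, inc=2.0, dec=0.5,
                          tol=1e-8, max_iter=1000):
    """Gradient descent with an Armijo-style backtracking step size that
    may both decrease (when the sufficient-decrease test fails) and
    increase (after a successful step). Illustrative sketch only."""
    x = np.asarray(x0, dtype=float)
    t = t0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Shrink t until the sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - 0.5 * t * np.dot(g, g):
            t *= dec
        x = x - t * g
        t *= inc  # adaptively try a larger step next iteration
    return x

# Usage: minimize f(x) = ||x - c||^2 / 2, whose minimizer is c.
c = np.array([1.0, -2.0])
f = lambda x: 0.5 * np.sum((x - c) ** 2)
grad = lambda x: x - c
x_star = backtracking_gradient(f, grad, np.zeros(2))
```

Allowing the step size to grow again after successful steps is what distinguishes this kind of adaptive scheme from plain backtracking, where the step can only shrink over the course of the iterations.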
Recommendations
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Additive Schwarz methods for convex optimization as gradient methods
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Fast first-order methods for composite convex optimization with backtracking
- Convergence properties of projected gradient methods with nonmonotonic back tracking technique for convex constrained optimization
Cites work
- Scientific article (no title available); zbMATH DE number 1981901
- Scientific article (no title available); zbMATH DE number 2113718
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion
- A finite element approach for the dual Rudin-Osher-Fatemi model and its nonoverlapping domain decomposition methods
- A sharp convergence estimate for the method of subspace corrections for singular systems of equations
- A simple nearly optimal restart scheme for speeding up first-order methods
- A simplified view of first order methods for optimization
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Accelerated non-overlapping domain decomposition method for total variation minimization
- Adaptive restart for accelerated gradient schemes
- Additive Schwarz methods for convex optimization as gradient methods
- An overlapping Schwarz method for virtual element discretizations in two dimensions
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Convergence Rate Analysis of a Multiplicative Schwarz Method for Variational Inequalities
- Convergence Rate of a Schwarz Multilevel Method for the Constrained Minimization of Nonquadratic Functionals
- Convergence analysis of generalized Schwarz algorithms for solving obstacle problems with T-monotone operator
- Convergence analysis of the fast subspace descent method for convex optimization problems
- Convergence rate of overlapping domain decomposition methods for the Rudin-Osher-Fatemi model based on a dual formulation
- Deep learning
- Fast first-order methods for composite convex optimization with backtracking
- Fast nonoverlapping block Jacobi method for the dual Rudin-Osher-Fatemi model
- Global and uniform convergence of subspace correction methods for some convex optimization problems
- Gradient methods for minimizing composite functions
- Iterative Methods by Space Decomposition and Subspace Correction
- KSPHPDDM and PCHPDDM: extending PETSc with advanced Krylov methods and robust multilevel overlapping Schwarz preconditioners
- Minimization of functions having Lipschitz continuous first partial derivatives
- On the convergence of block coordinate descent type methods
- One- and two-level Schwarz methods for variational inequalities of the second kind and their application to frictional contact
- Overlapping Schwarz preconditioners for isogeometric collocation methods
- Overlapping additive Schwarz preconditioners for isogeometric collocation discretizations of linear elasticity
- Overlapping domain decomposition methods for total variation denoising
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Pseudo-linear convergence of an additive Schwarz method for dual total variation minimization
- Rate of convergence for some constraint decomposition methods for nonlinear variational inequalities
- Sharpness, restart, and acceleration
- The method of alternating projections and the method of subspace corrections in Hilbert space
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
Cited in (8)
- Additive Schwarz methods for convex optimization as gradient methods
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- On the convergence of broadcast incremental algorithms with applications
- Additive Schwarz methods for fourth-order variational inequalities
- Fast non-overlapping domain decomposition methods for continuous multi-phase labeling problem
- Additive Schwarz methods for semilinear elliptic problems with convex energy functionals: convergence rate independent of nonlinearity
- Additive Schwarz algorithm for the nonlinear complementarity problem with \(M\)-function