Schwarz iterative methods: infinite space splittings
From MaRDI portal
Publication: 530579
DOI: 10.1007/s00365-015-9318-y
zbMath: 1356.65145
arXiv: 1501.00938
OpenAlex: W2962732332
MaRDI QID: Q530579
Publication date: 10 August 2016
Published in: Constructive Approximation
Full work available at URL: https://arxiv.org/abs/1501.00938
Keywords: convergence; Hilbert space splittings; alternating directions method; decay rate estimation; elliptic variational problem; greedy splitting; multiplicative Schwarz method; quadratic minimization; randomized splitting
Mathematics Subject Classification:
- Variational and other types of inequalities involving nonlinear operators (general) (47J20)
- Iterative procedures involving nonlinear operators (47J25)
- Numerical solutions to equations with nonlinear operators (65J15)
Related Items
Stochastic subspace correction in Hilbert space ⋮ Rates of convergence of randomized Kaczmarz algorithms in Hilbert spaces
Cites Work
- Convex optimization on Banach spaces
- Inexact coordinate descent: complexity and preconditioning
- Greedy and randomized versions of the multiplicative Schwarz method
- Relaxation in greedy approximation
- A randomized Kaczmarz algorithm with exponential convergence
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- On the convergence of the coordinate descent method for convex differentiable minimization
- On the convergence rate of SOR: A worst case estimate
- Nonlinear methods of approximation
- On the abstract theory of additive and multiplicative Schwarz algorithms
- Some remarks on greedy algorithms
- Greedy approximation in convex optimization
- Convergence analysis for Kaczmarz-type methods in a Hilbert space framework
- Greedy strategies for convex optimization
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Approximation and learning by greedy algorithms
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Dual Ascent Methods for Problems with Strictly Convex Costs and Linear Constraints: A Unified Approach
- Greedy Approximation
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Greedy approximation
- Iterative Methods by Space Decomposition and Subspace Correction
- Universal approximation bounds for superpositions of a sigmoidal function
- Sequential greedy approximation for certain convex optimization problems
- On the Convergence of Block Coordinate Descent Type Methods
- An introduction to frames and Riesz bases
- Convergence of a block coordinate descent method for nondifferentiable minimization