Convergence rate of a relaxed inertial proximal algorithm for convex minimization
DOI: 10.1080/02331934.2019.1696337 · zbMath: 1440.49014 · OpenAlex: W2993347174 · MaRDI QID: Q5110325
Publication date: 18 May 2020
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2019.1696337
Keywords: relaxation; Lyapunov analysis; nonsmooth convex minimization; maximally monotone operators; inertial proximal method
Numerical mathematical programming methods (65K05) Convex programming (90C25) Numerical optimization and variational techniques (65K10) Numerical methods based on nonlinear programming (49M37) Methods involving semicontinuity and convergence; relaxation (49J45)
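The keywords above name the method this publication studies. As a rough illustration only, a generic relaxed inertial proximal iteration for a nonsmooth convex function can be sketched as below (a minimal sketch, assuming the common extrapolate-then-relax form; the parameter choices, the test function f(x) = |x|, and all names are assumptions, not the exact scheme analyzed in the paper):

```python
def prox_abs(v, lam):
    """Proximal operator of f(x) = |x|, i.e. soft-thresholding."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def relaxed_inertial_prox(x0, lam=0.5, alpha=0.3, rho=1.0, iters=100):
    """Generic relaxed inertial proximal iteration (illustrative sketch).

    alpha: inertial (momentum) coefficient in [0, 1)
    rho:   relaxation parameter in (0, 2)
    """
    x_prev, x = x0, x0
    for _ in range(iters):
        y = x + alpha * (x - x_prev)                  # inertial extrapolation
        x_prev, x = x, (1 - rho) * y + rho * prox_abs(y, lam)  # relaxed proximal step
    return x

print(relaxed_inertial_prox(5.0))  # iterates approach 0, the minimizer of f(x) = |x|
```

With rho = 1 the scheme reduces to the classical inertial proximal point iteration; rho ≠ 1 over- or under-relaxes the proximal step, which is the combination whose convergence rate the cited paper analyzes.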
Related Items (10)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Convergence theorems for inertial KM-type algorithms
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Introductory lectures on convex optimization. A basic course.
- Convergence of damped inertial dynamics governed by regularized maximally monotone operators
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators
- On the proximal gradient algorithm with alternated inertia
- Performance of first-order methods for smooth convex minimization: a novel approach
- Convergence of a relaxed inertial forward-backward algorithm for structured monotone inclusions
- On damped second-order gradient systems
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Second Order Forward-Backward Dynamical Systems For Monotone Inclusion Problems
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Accelerated and Inexact Forward-Backward Algorithms
- Convex Optimization in Normed Spaces
- Stability of Over-Relaxations for the Forward-Backward Algorithm, Application to FISTA
- On the Minimizing Property of a Second Order Dissipative System in Hilbert Spaces
- Convergence Rates of Inertial Forward-Backward Algorithms
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia
- Weak Convergence of a Relaxed and Inertial Hybrid Projection-Proximal Point Algorithm for Maximal Monotone Operators in Hilbert Space
- Optimal Convergence Rates for Nesterov Acceleration
- Some methods of speeding up the convergence of iteration methods
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convex analysis and monotone operator theory in Hilbert spaces
- An inertial proximal method for maximal monotone operators via discretization of a nonlinear oscillator with damping