Accelerated methods with fastly vanishing subgradients for structured non-smooth minimization
DOI: 10.1007/s11075-021-01181-y · zbMATH: 1491.90119 · OpenAlex: W3197182085 · MaRDI QID: Q2129627
Florian Labarre, Paul-Emile Maingé
Publication date: 22 April 2022
Published in: Numerical Algorithms
Keywords: non-smooth minimization; global rate of convergence; inertial-type algorithm; fast first-order method; Nesterov-type algorithm; structured minimization
MSC classifications: Ill-posedness and regularization problems in numerical linear algebra (65F22); Convex programming (90C25); Large-scale problems in mathematical programming (90C06)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Fast convex optimization via inertial dynamics with Hessian driven damping
- Gradient methods for minimizing composite functions
- Fast first-order methods for composite convex optimization with backtracking
- On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm"
- Convergence in norm of modified Krasnoselski-Mann iterations for fixed points of demicontractive mappings
- An inertial forward-backward algorithm for monotone inclusions
- Ergodic convergence to a zero of the sum of monotone operators in Hilbert space
- A second-order gradient-like dissipative dynamical system with Hessian-driven damping. Application to optimization and mechanics.
- Convergence of a splitting inertial proximal method for monotone operators
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- Convergence of a relaxed inertial forward-backward algorithm for structured monotone inclusions
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- On the convergence of the forward–backward splitting method with linesearches
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- On the Convergence of the Proximal Point Algorithm for Convex Minimization
- New Proximal Point Algorithms for Convex Minimization
- Convergence Rates of Inertial Forward-Backward Algorithms
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia
- Asymptotic for a second-order evolution equation with convex potential and vanishing damping term
- Weak convergence of the sequence of successive approximations for nonexpansive mappings