Descentwise inexact proximal algorithms for smooth optimization
From MaRDI portal
Publication:1935582
DOI: 10.1007/s10589-012-9461-3 · zbMath: 1264.90160 · OpenAlex: W2021789026 · MaRDI QID: Q1935582
Marc Fuentes, Jérôme Malick, Claude Lemaréchal
Publication date: 18 February 2013
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-012-9461-3
Related Items (14)
- On the resolution of the variational inequalities of the first and the second kind as equations obtained by explicit Moreau-Yosida regularizations
- Principled analyses and design of first-order methods with inexact proximal operators
- On the proximal gradient algorithm with alternated inertia
- An inexact proximal regularization method for unconstrained optimization
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- An inexact and nonmonotone proximal method for smooth unconstrained minimization
- Local convergence analysis of a primal-dual method for bound-constrained optimization without SOSC
- Dual descent methods as tension reduction systems
- Distributed Learning with Sparse Communications by Identification
- A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization
- On the convergence of a multigrid method for Moreau-regularized variational inequalities of the second kind
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration
- Algebraic rules for quadratic regularization of Newton's method
- A Proximal Bundle Variant with Optimal Iteration-Complexity for a Large Range of Prox Stepsizes
Uses Software
Cites Work
- Some numerical experiments with variable-storage quasi-Newton algorithms
- A proximal approach to the inversion of ill-conditioned matrices
- Convergence of some algorithms for convex minimization
- Variable metric bundle methods: From conceptual to implementable forms
- A nonsmooth version of Newton's method
- Self-adaptive inexact proximal point methods
- An effective algorithm for minimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Inexact proximal point algorithms and descent methods in optimization
- A unified framework for some inexact proximal point algorithms
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- Updating Quasi-Newton Matrices with Limited Storage
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- On Convergence Properties of Algorithms for Unconstrained Minimization
- CUTE
- Convergence Conditions for Ascent Methods
- A method for the solution of certain non-linear problems in least squares