A minimization method for the sum of a convex function and a continuously differentiable function
From MaRDI portal
Publication:1134011
DOI: 10.1007/BF00935173 · zbMath: 0422.90070 · OpenAlex: W2053386824 · MaRDI QID: Q1134011
Publication date: 1981
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00935173
Keywords: convergence analysis; critical points; descent method; subgradients; nonconvex nondifferentiable optimization problems
Related Items (29)
- An alternating linearization bundle method for a class of nonconvex nonsmooth optimization problems
- Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
- \(l_1\)-\(l_2\) regularization of split feasibility problems
- A parallel descent algorithm for convex programming
- A proximal alternating linearization method for nonconvex optimization problems
- Inexact partial linearization methods for network equilibrium problems
- A New Boosted Proximal Point Algorithm for Minimizing Nonsmooth DC Functions
- Combined methods for dynamic spatial auction market models
- An adaptive partial linearization method for optimization problems on product sets
- The boosted DC algorithm for linearly constrained DC programming
- Affine Invariant Convergence Rates of the Conditional Gradient Method
- Short paper -- A note on the Frank-Wolfe algorithm for a class of nonconvex and nonsmooth optimization problems
- Composite Convex Minimization Involving Self-concordant-Like Cost Functions
- A successive quadratic programming method for a class of constrained nonsmooth optimization problems
- Accelerating the DC algorithm for smooth functions
- A coordinate gradient descent method for nonsmooth separable minimization
- An introduction to continuous optimization for imaging
- A proximal-point SQP trust region method for solving some special class of nonlinear semi-definite programming problems
- A sequential partial linearization algorithm for the symmetric eigenvalue complementarity problem
- Stochastic proximal quasi-Newton methods for non-convex composite optimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- The Boosted Difference of Convex Functions Algorithm for Nonsmooth Functions
- Make \(\ell_1\) regularization effective in training sparse CNN
- Hybrid methods for network equilibrium problems
- A method for minimizing the sum of a convex function and a continuously differentiable function
- Efficient Boosted DC Algorithm for Nonconvex Image Restoration with Rician Noise
- Partial linearization methods in nonlinear programming
- Nonmonotone spectral gradient method for sparse recovery
- Submonotone mappings and the proximal point algorithm
Cites Work
- Generalized Gradients and Applications
- Necessary and sufficient conditions for a penalty method to be exact
- Semismooth and Semiconvex Functions in Constrained Optimization
- Optimization of Lipschitz continuous functions
- An Algorithm for Constrained Optimization with Semismooth Functions
- A New Approach to Lagrange Multipliers
- Non-Linear Programming Via Penalty Functions
- An Exact Potential Method for Constrained Maxima
- Convex Analysis
- Minimization of unsmooth functionals
- New Conditions for Exactness of a Simple Penalty Function
- Constrained Optimization Using a Nondifferentiable Penalty Function
- Exact penalty functions in nonlinear programming
- Sufficient conditions for extremum, penalty functions and regularity