On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications
Publication: 2810547
DOI: 10.1137/140993090 · zbMath: 1338.65159 · OpenAlex: W2398068802 · MaRDI QID: Q2810547
Chong Li, Yao-Hua Hu, Xiao Qi Yang
Publication date: 3 June 2016
Published in: SIAM Journal on Optimization
Full work available at URL: https://semanticscholar.org/paper/f93105648b3ea56d337c196c94b3862c956bea27
Keywords: weak sharp minima; feasibility problem; convex composite optimization; sensor network localization; linearized proximal algorithm; quasi-regularity condition
MSC: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26)
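The record's subject is the linearized proximal (prox-linear) iteration for convex composite problems of the form min h(F(x)), where h is convex and F is smooth: each step minimizes h(F(x_k) + F'(x_k)d) plus a proximal penalty on d. The sketch below is a generic illustration only, not the authors' algorithm: the outer function is specialized to h = ½‖·‖², for which the subproblem has the closed-form regularized Gauss-Newton (Levenberg-Marquardt) solution, and the inner map F is an invented toy system.

```python
import numpy as np

# Toy inner map for illustration (not from the paper):
# F(x) = (x1^2 + x2 - 1, x1 - x2^2), a smooth map with a zero residual.
def F(x):
    return np.array([x[0] ** 2 + x[1] - 1.0, x[0] - x[1] ** 2])

def J(x):
    # Jacobian of F at x
    return np.array([[2.0 * x[0], 1.0],
                     [1.0, -2.0 * x[1]]])

def prox_linear(F, J, x0, t=10.0, iters=50):
    """Linearized proximal iteration, specialized to h = 0.5*||.||^2.

    Each step minimizes 0.5*||F(x_k) + J(x_k) d||^2 + (1/(2t))*||d||^2
    over d, which reduces to the linear system
        (J^T J + (1/t) I) d = -J^T F(x_k).
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        Fx, Jx = F(x), J(x)
        d = np.linalg.solve(Jx.T @ Jx + np.eye(n) / t, -Jx.T @ Fx)
        x = x + d
    return x

# Usage: starting from (1, 1), the iterates drive ||F(x)|| toward zero.
x = prox_linear(F, J, np.array([1.0, 1.0]))
```

For a general convex h (e.g. an ℓ1 norm or an indicator function), the per-step subproblem is itself a convex program and would be handed to a dedicated solver rather than solved in closed form.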
Related Items (30)
Uses Software
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A proximal method for composite minimization
- Composite proximal bundle method
- Gradient methods for minimizing composite functions
- Strong KKT conditions and weak sharp solutions in convex-composite optimization
- Convergence analysis of the Gauss-Newton method for convex inclusion and convex-composite optimization problems
- Tame functions are semismooth
- Error bounds in mathematical programming
- A Gauss-Newton method for convex composite optimization
- A nonsmooth version of Newton's method
- Optimization theory and methods. Nonlinear programming
- Inexact subgradient methods for quasi-convex optimization problems
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method
- Weak Sharp Minima in Mathematical Programming
- Weak Sharp Minima for Semi-infinite Optimization Problems with Applications
- Majorizing Functions and Convergence of the Gauss–Newton Method for Convex Composite Optimization
- Further Relaxations of the Semidefinite Programming Approach to Sensor Network Localization
- Local properties of algorithms for minimizing nonsmooth composite functions
- Conditions for convergence of trust region algorithms for nonsmooth optimization
- Descent methods for composite nondifferentiable optimization problems
- Second order necessary and sufficient conditions for convex composite NDO
- First- and Second-Order Epi-Differentiability in Nonlinear Programming
- Stability Theory for Systems of Inequalities, Part II: Differentiable Nonlinear Systems
- Monotone Operators and the Proximal Point Algorithm
- Normalized Incremental Subgradient Algorithm and Its Application
- On Projection Algorithms for Solving Convex Feasibility Problems
- Second-order Sufficiency and Quadratic Growth for Nonisolated Minima
- Weak Sharp Minima: Characterizations and Sufficient Conditions
- Alternating Projections on Manifolds
- On convergence of the Gauss-Newton method for convex composite optimization