Proximal variable smoothing method for three-composite nonconvex nonsmooth minimization with a linear operator
From MaRDI portal
Publication: 6126596
DOI: 10.1007/s11075-023-01645-3
MaRDI QID: Q6126596
Publication date: 9 April 2024
Published in: Numerical Algorithms
Mathematics Subject Classification:
- Convex programming (90C25)
- Nonlinear programming (90C30)
- Numerical optimization and variational techniques (65K10)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A three-operator splitting scheme and its optimization applications
- Variable smoothing for weakly convex composite functions
- Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
- Accelerating incremental gradient optimization with curvature information
- Variable smoothing for convex optimization problems using stochastic gradients
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- A variable smoothing algorithm for solving convex optimization problems
- An envelope for Davis-Yin splitting and strict saddle-point avoidance
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Projecting onto the Intersection of a Cone and a Sphere
- First-Order Methods in Optimization
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
- Sparsity and Smoothness Via the Fused Lasso
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
- A Three-Operator Splitting Algorithm for Nonconvex Sparsity Regularization
- Convergence Rate of Incremental Gradient and Incremental Newton Methods
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Convergent Incremental Gradient Method with a Constant Step Size
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization