Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
Publication: 6176299
DOI: 10.1007/s10915-023-02302-6 · OpenAlex: W4385792382 · MaRDI QID: Q6176299
Chungen Shen, Lei-Hong Zhang, Wei Hong Yang, Yunlong Wang
Publication date: 22 August 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://doi.org/10.1007/s10915-023-02302-6
Keywords: duality gap; active set; polyhedral projection problem; proximal gradient algorithm; proximal semismooth Newton method
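The record carries only the title and keywords, so as orientation here is a minimal NumPy sketch of the underlying problem class: projection onto a polyhedron, min_x ½‖x − y‖² s.t. Ax ≤ b, solved by projected gradient on its dual with a duality-gap/feasibility stopping test. This is an illustration of the problem, not the authors' proximal-gradient/semismooth-Newton duality-gap-active-set algorithm; the function name, step-size choice, and tolerances are invented for the example.

```python
import numpy as np

def project_polyhedron(A, b, y, tol=1e-8, max_iter=10000):
    """Approximate P(y) = argmin_x 0.5*||x - y||^2  s.t.  A x <= b
    (illustrative sketch, not the paper's method).

    Dual problem:  min_{lam >= 0}  0.5*||A^T lam||^2 - lam @ (A y - b),
    with primal recovery x = y - A^T lam (stationarity holds by construction).
    """
    lam = np.zeros(A.shape[0])
    step = 1.0 / np.linalg.norm(A @ A.T, 2)   # 1/L, L = ||A A^T||_2 (dual gradient Lipschitz constant)
    for _ in range(max_iter):
        x = y - A.T @ lam                     # primal point induced by the dual iterate
        grad = b - A @ x                      # dual gradient: A A^T lam - (A y - b)
        # For this primal-dual pair, f(x) - d(lam) = lam @ (b - A x); together with
        # the feasibility residual this gives a KKT-based stopping test. Constraints
        # with lam_i = 0 are inactive; the paper's strategy uses the duality gap to
        # screen such constraints, which this plain sketch does not attempt.
        gap = lam @ (b - A @ x)
        infeas = max(0.0, float(np.max(A @ x - b)))
        if abs(gap) <= tol and infeas <= tol:
            break
        lam = np.maximum(0.0, lam - step * grad)   # projected gradient step on the dual
    return y - A.T @ lam, lam

# Small usage example with random data (hypothetical instance):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.random(20)
y = 3 * rng.standard_normal(5)
x, lam = project_polyhedron(A, b, y)
print(float(np.max(A @ x - b)))   # near-feasibility of the computed projection
```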
Cites Work
- Fast projection onto the simplex and the \(l_1\) ball
- An active set algorithm for nonlinear optimization with polyhedral constraints
- A linear separability criterion for sets of Euclidean space
- Gradient methods with adaptive step-sizes
- DC formulations and algorithms for sparse optimization problems
- Nonsmooth penalty and subgradient algorithms to solve the problem of projection onto a polytope
- An accelerated active-set algorithm for a quadratic semidefinite program with general constraints
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- Solving nonnegative sparsity-constrained optimization via DC quadratic-piecewise-linear approximations
- Linear convergence of a nonmonotone projected gradient method for multiobjective optimization
- On the acceleration of the Barzilai-Borwein method
- A brief introduction to manifold optimization
- An active-set proximal-Newton algorithm for \(\ell_1\) regularized optimization problems with box constraints
- A nonsmooth version of Newton's method
- Computational acceleration of projection algorithms for the linear best approximation problem
- Projection onto a Polyhedron that Exploits Sparsity
- Introduction to Nonsmooth Optimization
- Proximal Newton-Type Methods for Minimizing Composite Functions
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- A New Active Set Algorithm for Box Constrained Optimization
- Two-Point Step Size Gradient Methods
- Semismooth and Semiconvex Functions in Constrained Optimization
- Cost Approximation: A Unified Framework of Descent Algorithms for Nonlinear Programs
- Alternate step gradient method
- A Projected Gradient and Constraint Linearization Method for Nonlinear Model Predictive Control
- Sparse Reconstruction by Separable Approximation
- First-Order Methods in Optimization
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Gap Safe screening rules for sparsity enforcing penalties
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Fastest Mixing Markov Chain on a Graph
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Safe Feature Elimination in Sparse Supervised Learning
- Riemannian Optimization on the Symplectic Stiefel Manifold
- Equipping the Barzilai--Borwein Method with the Two Dimensional Quadratic Termination Property
- Efficient projection onto the intersection of a half-space and a box-like set and its generalized Jacobian
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
- Convex Analysis
- Semismooth Matrix-Valued Functions
- Benchmarking optimization software with performance profiles