A Two-Phase Gradient Method for Quadratic Programming Problems with a Single Linear Constraint and Bounds on the Variables
Publication: 4687242
DOI: 10.1137/17M1128538 · zbMath: 1461.65141 · arXiv: 1705.01797 · OpenAlex: W2610492509 · MaRDI QID: Q4687242
Marco Viola, Jesse L. Barlow, Daniela di Serafino, Gerardo Toraldo
Publication date: 11 October 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1705.01797
Related Items
- Comparison of active-set and gradient projection-based algorithms for box-constrained quadratic programming
- On the stationarity for nonlinear optimization problems with polyhedral constraints
- Combined Newton-gradient method for constrained root-finding in chemical reaction networks
- A subspace-accelerated split Bregman method for sparse data recovery with joint \(\ell_1\)-type regularizers
- A two-phase method for solving continuous rank-one quadratic knapsack problems
- Using gradient directions to get global convergence of Newton-type methods
- Steplength selection in gradient projection methods for box-constrained quadratic programs
- ACQUIRE: an inexact iteratively reweighted norm approach for TV-based Poisson image restoration
- Minimization over the \(\ell_1\)-ball using an active-set non-monotone projected gradient
- An active-set algorithmic framework for non-convex optimization problems over the simplex
- On the convergence properties of scaled gradient projection methods with non-monotone Armijo-like line searches
- Hybrid limited memory gradient projection methods for box-constrained optimization problems
Cites Work
- On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems
- Fast projection onto the simplex and the \(l_1\) ball
- An active set algorithm for nonlinear optimization with polyhedral constraints
- An efficient gradient method using the Yuan steplength
- New adaptive stepsize selections in gradient methods
- Constrained global optimization: algorithms and applications
- Algorithms for bound constrained quadratic programming problems
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming
- Minimizing quadratic functions subject to bound constraints with the rate of convergence and finite termination
- Minimizing quadratic functions with semidefinite Hessian subject to bound constraints
- Optimal \(L_2\)-norm empirical importance weights for the change of probability measure
- On the steplength selection in gradient methods for unconstrained optimization
- New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
- On spectral properties of steepest descent methods
- An Affine-Scaling Interior-Point Method for Continuous Knapsack Constraints with Application to Support Vector Machines
- On the Identification Property of a Projected Gradient Method
- Multipoint methods for separable nonlinear networks
- A New Active Set Algorithm for Box Constrained Optimization
- A Solver for Nonconvex Bound-Constrained Quadratic Optimization
- A scaled gradient projection method for constrained image deblurring
- Projected gradient methods for linearly constrained problems
- Quasi-Newton Updates with Bounds
- On the Solution of Large Quadratic Programming Problems with Bound Constraints
- On the Maximization of a Concave Quadratic Function with Box Constraints
- Box Constrained Quadratic Programming with Proportioning and Projections
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Handling nonpositive curvature in a limited memory steepest descent method
- On the numerical solution of bound constrained optimization problems
- Gradient projection methods for quadratic programs and applications in training support vector machines
- Benchmarking optimization software with performance profiles.