Approximation Bounds for Sparse Programs
From MaRDI portal
Publication:5073726
Abstract: We show that sparsity-constrained optimization problems over low-dimensional spaces tend to have a small duality gap. We use the Shapley-Folkman theorem to derive both data-driven bounds on the duality gap and an efficient primalization procedure that recovers feasible points satisfying these bounds. The error bounds are proportional to the rate of growth of the objective with the target cardinality, which means in particular that the relaxation is nearly tight as soon as the target cardinality is large enough that only uninformative features are added.
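A minimal numerical sketch of the abstract's key quantity (an illustration on synthetic data, not the paper's algorithm): brute-force the best objective value \(f(k)\) over all supports of cardinality at most \(k\) for a small least-squares problem. The abstract's gap bound scales with the marginal gain in the objective as the cardinality grows, so once \(f(k)\) flattens (only uninformative features remain to be added) the relaxation is nearly tight. All problem data and names below are hypothetical.

```python
# Sketch: compute f(k) = min { ||A x - b||^2 : ||x||_0 <= k } by enumerating
# supports, and inspect the marginal gains f(k-1) - f(k), which control the
# duality-gap bound described in the abstract. Synthetic data (assumption).
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
m, n, k_true = 20, 8, 3            # samples, features, true sparsity
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:k_true] = rng.standard_normal(k_true)
b = A @ x_true + 0.01 * rng.standard_normal(m)

def best_value(k):
    """Smallest residual ||A x - b||^2 over all supports of size <= k."""
    best = float(b @ b)            # k = 0 baseline: x = 0
    for size in range(1, k + 1):
        for S in combinations(range(n), size):
            xs = np.linalg.lstsq(A[:, list(S)], b, rcond=None)[0]
            r = b - A[:, list(S)] @ xs
            best = min(best, float(r @ r))
    return best

f = [best_value(k) for k in range(n + 1)]
gains = [f[k] - f[k + 1] for k in range(n)]
print([round(v, 4) for v in f])    # nonincreasing; flattens near k_true
```

Enumeration is exponential in general; it is used here only because the instance is tiny. The point of the printout is that the marginal gains collapse once the target cardinality exceeds the number of informative features, which is exactly the regime where the abstract says the convex relaxation is nearly tight.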
Recommendations
- Approximability of Sparse Integer Programs
- Approximability of sparse integer programs
- Approximation hardness for a class of sparse optimization problems
- Sparse Approximate Solutions to Semidefinite Programs
- Sparse approximation by greedy algorithms
- Sparsification and subexponential approximation
- Computing sparse approximations deterministically
- A note on the hardness of sparse approximation
- On the Complexity of Solving Sparse Symmetric Linear Programs Specified with Approximate Data
- On approximating (sparse) covering integer programs
Cites work
- scientific article; zbMATH DE number 439380 (no title available)
- scientific article; zbMATH DE number 2107836 (no title available)
- A continuous exact \(\ell_0\) penalty (CEL0) for least squares regularized problem
- A semidefinite programming method for integer convex quadratic minimization
- Bounding duality gap for separable problems with linear constraints
- Decoding by Linear Programming
- Estimates of the Duality Gap in Nonconvex Optimization
- On general minimax theorems
- On the convexification of constrained quadratic optimization problems with indicator variables
- Optimal cardinality constrained portfolio selection
- Perspective cuts for a class of convex 0-1 mixed integer programs
- Quasi-Equilibria in Markets with Non-Convex Preferences
- Scikit-learn: machine learning in Python
- Sparse Approximate Solutions to Linear Systems
- Sparse learning via Boolean relaxations
- Why Are Big Data Matrices Approximately Low Rank?
Cited in (10)
- Sparse semidefinite programs with guaranteed near-linear time complexity via dualized clique tree conversion
- Exact Sparse Approximation Problems via Mixed-Integer Programming: Formulations and Computational Performance
- Consistency bounds and support recovery of d-stationary solutions of sparse sample average approximations
- Parameterized complexity of sparse linear complementarity problems
- Design Strategies for ARX with Provable Bounds: Sparx and LAX
- Stable Bounds on the Duality Gap of Separable Nonconvex Optimization Problems
- Phase transitions for greedy sparse approximation algorithms
- Lagrangian duality and saddle points for sparse linear programming
- Sparse learning via Boolean relaxations
- \(\ell_1\)-sparsity approximation bounds for packing integer programs