Convexification with bounded gap for randomly projected quadratic optimization
From MaRDI portal
Publication:5080508
Abstract: Random projection techniques based on the Johnson-Lindenstrauss lemma randomly aggregate the constraints or variables of an optimization problem while approximately preserving its optimal value, which yields a smaller-scale problem. D'Ambrosio et al. applied random projection to a quadratic optimization problem to reduce the number of decision variables. Although the problem size becomes smaller, the projected problem is almost surely non-convex whenever the original problem is non-convex, and hence remains hard to solve. In this paper, exploiting the fact that random projection alleviates the degree of non-convexity of a non-convex quadratic optimization problem, we find an approximate global optimal value of the problem by reducing it to a smaller convex problem. To the best of our knowledge, this is the first paper to use random projection to convexify non-convex optimization problems. We evaluate the approximation error between the optimal value of a non-convex optimization problem and that of its convexified, randomly projected counterpart.
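The abstract's pipeline (project a quadratic form to a lower dimension, then convexify the small problem) can be sketched numerically. This is an illustrative toy, not the paper's algorithm: the matrix construction, the dimensions, and the eigenvalue-clipping convexification step are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 20  # original and reduced dimensions (illustrative choices)

# Build a symmetric Q that is mildly non-convex: many positive
# eigenvalues and a few small negative ones.
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.concatenate([np.full(5, -0.5), rng.uniform(1.0, 10.0, n - 5)])
Q = (V * eigs) @ V.T

# Gaussian random projection P (n x k); substituting x = P y turns the
# quadratic form x^T Q x into the reduced form y^T (P^T Q P) y in R^k.
P = rng.standard_normal((n, k)) / np.sqrt(k)
Q_proj = P.T @ Q @ P
Q_proj = (Q_proj + Q_proj.T) / 2  # symmetrize against round-off

# Convexify the reduced matrix by clipping its negative eigenvalues;
# the resulting small convex problem can be handed to a convex solver.
w, U = np.linalg.eigh(Q_proj)
Q_convex = (U * np.maximum(w, 0.0)) @ U.T

print("min eigenvalue, original   :", np.linalg.eigvalsh(Q).min())
print("min eigenvalue, projected  :", w.min())
print("min eigenvalue, convexified:", np.linalg.eigvalsh(Q_convex).min())
```

The gap between the projected and the clipped matrix is what a bound like the paper's would control; here it is only printed, not bounded.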
Cites work
- A trace inequality of John von Neumann
- An introduction to matrix concentration inequalities
- CVXPY: a Python-embedded modeling language for convex optimization
- Concentration inequalities and moment bounds for sample covariance operators
- Determinantal point processes in randomized numerical linear algebra
- Dimensionality reduction of SDPs through sketching
- Extensions of Lipschitz mappings into a Hilbert space
- Feature discovery in non-metric pairwise data
- High-dimensional probability. An introduction with applications in data science
- Learning the kernel matrix with semidefinite programming
- Random projections for linear programming
- Random projections for quadratic programs
- Randomized Algorithms for Matrices and Data
- Randomized Sketches of Convex Programs With Sharp Guarantees
- Semidefinite Programming
- Sketching as a tool for numerical linear algebra
- Sketching meets random projection in the dual: a provable recovery algorithm for big and high-dimensional data
- Support vector machine classification with indefinite kernels
- Support-vector networks
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
Cited in (3)