Convexification with bounded gap for randomly projected quadratic optimization
Publication: Q5080508
DOI: 10.1137/21M1433678
zbMATH Open: 1493.90122
arXiv: 2107.05272
OpenAlex: W4298096160
MaRDI QID: Q5080508
Authors: Terunari Fuji, Pierre-Louis Poirion, Akiko Takeda
Publication date: 31 May 2022
Published in: SIAM Journal on Optimization
Abstract: Random projection techniques based on the Johnson–Lindenstrauss lemma are used to randomly aggregate the constraints or variables of optimization problems while approximately preserving their optimal values, which yields smaller-scale optimization problems. D'Ambrosio et al. applied random projection to a quadratic optimization problem in order to decrease the number of decision variables. Although the problem size becomes smaller, the projected problem is almost surely non-convex whenever the original problem is non-convex, and hence remains hard to solve. In this paper, exploiting the fact that random projection can alleviate the degree of non-convexity of a non-convex quadratic optimization problem, we find an approximate global optimal value of the problem by reducing it to a smaller convex problem. To the best of our knowledge, ours is the first paper to use random projection for the convexification of non-convex optimization problems. We evaluate the approximation error between the optimal value of a non-convex optimization problem and that of its convexified, randomly projected counterpart.
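The variable-reduction step described in the abstract can be sketched as follows. This is a hedged illustration, not the authors' exact method: it substitutes x = Pᵀy with a Johnson–Lindenstrauss-style Gaussian matrix P, so a quadratic in n variables becomes a quadratic in k ≪ n variables; all dimensions and scalings here are illustrative assumptions.

```python
import numpy as np

# Sketch (assumed setup, not the paper's exact construction): reduce a
# quadratic q(x) = x^T Q x + c^T x in n variables to k << n variables
# via the substitution x = P^T y, where P is a k x n Gaussian matrix
# with Johnson-Lindenstrauss-style scaling 1/sqrt(k).
rng = np.random.default_rng(0)
n, k = 500, 50
Q = rng.standard_normal((n, n))
Q = (Q + Q.T) / 2          # symmetric, generally indefinite (non-convex case)
c = rng.standard_normal(n)

P = rng.standard_normal((k, n)) / np.sqrt(k)  # random projection matrix
Q_proj = P @ Q @ P.T       # k x k projected quadratic form
c_proj = P @ c             # projected linear term

# The projected problem min_y y^T Q_proj y + c_proj^T y has only k
# variables; any y maps back to the original space via x = P^T y, and
# the two objective values agree exactly by construction:
y = rng.standard_normal(k)
x = P.T @ y
assert np.isclose(x @ Q @ x + c @ x, y @ Q_proj @ y + c_proj @ y)
```

The paper's contribution concerns what happens next: the projected matrix P Q Pᵀ tends to be "less non-convex" than Q, which the authors exploit to pass to a convex problem whose optimal value approximates the original one with a bounded gap.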
Full work available at URL: https://arxiv.org/abs/2107.05272
Cites Work
- CVXPY: a Python-embedded modeling language for convex optimization
- Support-vector networks
- Extensions of Lipschitz mappings into a Hilbert space
- High-dimensional probability. An introduction with applications in data science
- Semidefinite Programming
- An introduction to matrix concentration inequalities
- Randomized Algorithms for Matrices and Data
- Feature discovery in non-metric pairwise data
- Learning the kernel matrix with semidefinite programming
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- A trace inequality of John von Neumann
- Support vector machine classification with indefinite kernels
- Randomized Sketches of Convex Programs With Sharp Guarantees
- Concentration inequalities and moment bounds for sample covariance operators
- Random projections for quadratic programs
- Random projections for linear programming
- Sketching as a tool for numerical linear algebra
- Sketching meets random projection in the dual: a provable recovery algorithm for big and high-dimensional data
- Dimensionality reduction of SDPs through sketching
- Determinantal point processes in randomized numerical linear algebra
Cited In (3)
This page was built for publication: Convexification with bounded gap for randomly projected quadratic optimization