AN EXPONENTIAL LOWER BOUND ON THE COMPLEXITY OF REGULARIZATION PATHS

From MaRDI portal
Publication:2968091

DOI: 10.20382/JOCG.V3I1A9
zbMATH Open: 1404.68103
arXiv: 0903.4817
OpenAlex: W2963576992
MaRDI QID: Q2968091
FDO: Q2968091


Authors: Martin Jaggi, Clément Maria, B. Gärtner


Publication date: 9 March 2017

Abstract: For a variety of regularized optimization problems in machine learning, algorithms that compute the entire solution path have recently been developed. Most of these problems are quadratic programs parameterized by a single regularization parameter, as for example the Support Vector Machine (SVM). Solution path algorithms compute not only the solution for one particular value of the regularization parameter but the entire path of solutions, which makes selecting an optimal parameter much easier. It has been assumed that these piecewise linear solution paths have only linear complexity, i.e., linearly many bends. We prove that for the support vector machine this complexity can be exponential in the number of training points in the worst case. More strongly, we construct a single instance of n input points in d dimensions for an SVM such that at least Theta(2^{n/2}) = Theta(2^d) many distinct subsets of support vectors occur as the regularization parameter changes.


Full work available at URL: https://arxiv.org/abs/0903.4817








Cited In (4)





