Analyzing Sparse Dictionaries for Online Learning With Kernels
From MaRDI portal
Publication:4580941
DOI: 10.1109/TSP.2015.2457396 · zbMATH Open: 1395.94119 · arXiv: 1409.6045 · OpenAlex: W2963527399 · MaRDI QID: Q4580941
Publication date: 22 August 2018
Published in: IEEE Transactions on Signal Processing
Abstract: Many signal processing and machine learning methods share essentially the same linear-in-the-parameters model, with as many parameters as available samples, as in kernel-based machines. Sparse approximation is essential in many disciplines, and new challenges emerge in online learning with kernels. To this end, several sparsity measures have been proposed in the literature to quantify the sparseness of dictionaries and to construct relevant ones, the most prominent being the distance, approximation, coherence, and Babel measures. In this paper, we analyze sparse dictionaries based on these measures. Through an eigenvalue analysis, we show that these sparsity measures share many properties, including guaranteeing the linear independence condition and inducing a well-posed optimization problem. Furthermore, we prove that a quasi-isometry exists between the parameter (i.e., dual) space and the feature space induced by the dictionary.
Full work available at URL: https://arxiv.org/abs/1409.6045
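The coherence measure mentioned in the abstract is the largest absolute normalized kernel value between two distinct dictionary atoms, and the eigenvalue analysis concerns the dictionary's Gram matrix: a strictly positive smallest eigenvalue certifies linear independence of the atoms in feature space and hence a well-posed optimization problem. A minimal sketch of both quantities, assuming a Gaussian kernel (the function and variable names below are illustrative, not taken from the paper):

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel; atoms have unit norm in feature space."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * bandwidth ** 2))

def coherence(atoms, kernel=gaussian_kernel):
    """Largest absolute normalized kernel value between distinct atoms."""
    mu = 0.0
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            kij = kernel(atoms[i], atoms[j])
            kii = kernel(atoms[i], atoms[i])
            kjj = kernel(atoms[j], atoms[j])
            mu = max(mu, abs(kij) / np.sqrt(kii * kjj))
    return mu

def gram_eigen_bounds(atoms, kernel=gaussian_kernel):
    """Extreme eigenvalues of the Gram matrix K[i, j] = kernel(x_i, x_j).
    A smallest eigenvalue bounded away from zero means the atoms are
    linearly independent in feature space, so the induced least-squares
    problem is well posed."""
    K = np.array([[kernel(xi, xj) for xj in atoms] for xi in atoms])
    eigvals = np.linalg.eigvalsh(K)
    return eigvals[0], eigvals[-1]

# Toy dictionary of 5 random atoms in R^3 (illustrative data only).
rng = np.random.default_rng(0)
atoms = [rng.standard_normal(3) for _ in range(5)]
mu = coherence(atoms)
lam_min, lam_max = gram_eigen_bounds(atoms)
```

For distinct atoms under a Gaussian kernel the coherence stays below 1 and the Gram matrix is positive definite, so `lam_min > 0`; a dictionary-construction rule that caps the coherence keeps the problem well conditioned online.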
Cited In (5)