Fast rates for empirical vector quantization
Publication: Q351685
DOI: 10.1214/13-EJS822 · zbMath: 1349.62038 · arXiv: 1201.6052 · MaRDI QID: Q351685
Publication date: 9 July 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1201.6052
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Approximations to statistical distributions (nonasymptotic) (62E17)
Related Items (9)
- Compressive statistical learning with random feature moments
- Statistical learning guarantees for compressive clustering and compressive mixture modeling
- Empirical risk minimization for heavy-tailed losses
- Also for \(k\)-means: more data does not imply better performance
- Dimensionality-Dependent Generalization Bounds for k-Dimensional Coding Schemes
- Bandwidth selection in kernel empirical risk minimization via the gradient
- Robust \(k\)-means clustering for distributions with two moments
- On strong consistency of kernel \(k\)-means: a Rademacher complexity approach
- Nonasymptotic bounds for vector quantization in Hilbert spaces
Cites Work
- Some limit theorems for empirical processes (with discussion)
- Risk bounds for statistical learning
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003
- Quantization and clustering with Bregman divergences
- Strong consistency of k-means clustering
- A central limit theorem for k-means clustering
- A Bennett concentration inequality and its application to suprema of empirical processes
- Principal points and self-consistent points of symmetric multivariate distributions
- On Hölder fields clustering
- Foundations of quantization for probability distributions
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Statistical performance of support vector machines
- Improved Minimax Bounds on the Test and Training Distortion of Empirically Designed Vector Quantizers
- Individual Convergence Rates in Empirical Vector Quantizer Design
- On the Performance of Clustering in Hilbert Spaces
- Quantization and the method of \(k\)-means
- Integrals on a moving manifold and geometrical probability
- On the amount of statistical side information required for lossy data compression
- The minimax distortion redundancy in empirical quantizer design
- Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding
This page was built for publication: Fast rates for empirical vector quantization