Hanson-Wright inequality in Hilbert spaces with application to \(K\)-means clustering for non-Euclidean data
DOI: 10.3150/20-BEJ1251 · zbMath: 1475.60044 · arXiv: 1810.11180 · Wikidata: Q114038756 · Scholia: Q114038756 · MaRDI QID: Q2214261
Publication date: 7 December 2020
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1810.11180
Mathematics Subject Classification:
- Gaussian processes (60G15)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Inequalities; stochastic orderings (60E15)
- Probability theory on linear topological spaces (60B11)
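For context, a standard finite-dimensional form of the Hanson-Wright inequality is sketched below (following, e.g., the cited work "Hanson-Wright inequality and sub-Gaussian concentration"); the publication above extends bounds of this type to Hilbert-space-valued data, so the exact statement and constants in the paper may differ. Let \(X = (X_1, \dots, X_n)\) have independent, mean-zero coordinates with sub-Gaussian norms \(\|X_i\|_{\psi_2} \le K\), and let \(A \in \mathbb{R}^{n \times n}\) be a fixed matrix. Then there is an absolute constant \(c > 0\) such that for all \(t \ge 0\),
\[
\mathbb{P}\bigl( \lvert X^{\top} A X - \mathbb{E}\, X^{\top} A X \rvert > t \bigr)
\le 2 \exp\!\Bigl( -c \min\Bigl( \frac{t^2}{K^4 \|A\|_F^2}, \; \frac{t}{K^2 \|A\|} \Bigr) \Bigr),
\]
where \(\|A\|_F\) denotes the Frobenius norm and \(\|A\|\) the operator norm of \(A\).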
Related Items
- Diffusion \(K\)-means clustering on manifolds: provable exact recovery via semidefinite relaxations
- Correlation Tensor Decomposition and Its Application in Spatial Imaging Data
- Optimally tackling covariate shift in RKHS-based nonparametric regression
- Community detection with dependent connectivity
- Partial recovery bounds for clustering with the relaxed \(K\)-means
- An \({\ell_p}\) theory of PCA and spectral clustering
Cites Work
- A mathematical introduction to compressive sensing
- Transference principles for log-Sobolev and spectral-gap with applications to conservative spin systems
- A tail inequality for quadratic forms of subgaussian random vectors
- Hanson-Wright inequality and sub-Gaussian concentration
- A note on the Hanson-Wright inequality for random vectors with dependencies
- SubGaussian random variables in Hilbert spaces
- Gaussian and bootstrap approximations for high-dimensional U-statistics and their applications
- Clustering functional data
- Adaptive estimation of a quadratic functional by model selection.
- A bound on tail probabilities for quadratic forms in independent random variables whose distributions are not necessarily symmetric
- When do birds of a feather flock together? \(k\)-means, proximity, and conic programming
- Partial recovery bounds for clustering with the relaxed \(K\)-means
- A survey of kernel and spectral methods for clustering
- Nonparametric functional data analysis. Theory and practice.
- Random weighted projections, random quadratic forms and random eigenvectors
- Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh
- High-Dimensional Statistics
- High-Dimensional Probability
- Least squares quantization in PCM
- Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
- Advanced Lectures on Machine Learning
- Approximating K‐means‐type Clustering via Semidefinite Programming
- A Bound on Tail Probabilities for Quadratic Forms in Independent Random Variables
- Theory of Reproducing Kernels