Improved concentration bounds for count-sketch
From MaRDI portal
Publication: Q5384012
DOI: 10.1137/1.9781611973402.51
zbMATH Open: 1422.68052
OpenAlex: W4205108489
MaRDI QID: Q5384012
Authors: Gregory Minton, Eric Price
Publication date: 20 June 2019
Published in: Proceedings of the Twenty-Fifth Annual ACM-SIAM Symposium on Discrete Algorithms
Abstract: We present a refined analysis of the classic Count-Sketch streaming heavy hitters algorithm [CCF02]. Count-Sketch uses O(k log n) linear measurements of a vector x in R^n to give an estimate x' of x. The standard analysis shows that this estimate x' satisfies ||x' - x||_infty^2 < ||x_tail||_2^2 / k, where x_tail is the vector containing all but the largest k coordinates of x. Our main result is that most of the coordinates of x' have substantially less error than this upper bound: for any c < O(log n), each coordinate i satisfies (x'_i - x_i)^2 < (c / log n) ||x_tail||_2^2 / k with probability 1 - 2^{-Omega(c)}, as long as the hash functions are fully independent. This subsumes the previous bound and is optimal for all c.

Using these improved point estimates, we prove a stronger concentration result for set estimates by first analyzing the covariance matrix and then using a median-of-median-of-medians argument to bootstrap the failure probability bounds. These results also yield improved l_2 recovery of exactly k-sparse estimates x^* when x is drawn from a distribution with suitable decay, such as a power law or lognormal.

We complement our results with simulations of Count-Sketch on a power-law distribution. The empirical evidence indicates that our theoretical bounds give a precise characterization of the algorithm's performance: the asymptotics are correct and the associated constants are small. Our proof shows that any symmetric random variable with finite variance and positive Fourier transform concentrates around 0 at least as well as a Gaussian. This result, which may be of independent interest, gives good concentration even when the noise does not converge to a Gaussian.
Full work available at URL: https://arxiv.org/abs/1207.5200
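The abstract describes the Count-Sketch point estimator: each coordinate is recovered as the median, over the rows of the sketch, of its sign-corrected counter. The following is a minimal illustrative sketch in Python/NumPy, not the paper's implementation. The parameters (7 rows, width 6k, power-law exponent 0.7) are arbitrary choices for the demonstration, and explicit random tables stand in for the fully independent hash functions the analysis assumes.

```python
import numpy as np

def count_sketch(x, rows, width, rng):
    """Sketch vector x with `rows` independent (hash, sign) pairs.

    Hash functions are modeled as random lookup tables, a stand-in for
    the fully independent hashes assumed in the paper's analysis.
    """
    n = len(x)
    h = rng.integers(0, width, size=(rows, n))   # bucket assignment per row
    s = rng.choice([-1.0, 1.0], size=(rows, n))  # random sign per row
    C = np.zeros((rows, width))
    for r in range(rows):
        np.add.at(C[r], h[r], s[r] * x)          # C[r, h[r,i]] += s[r,i] * x[i]
    return C, h, s

def estimate(C, h, s, i):
    """Point estimate x'_i: median over rows of sign-corrected counters."""
    rows = C.shape[0]
    return float(np.median([s[r, i] * C[r, h[r, i]] for r in range(rows)]))

rng = np.random.default_rng(0)
n, k = 10_000, 50
# Power-law input: coordinate j has magnitude (j+1)^{-0.7}.
x = np.arange(1, n + 1, dtype=float) ** -0.7
C, h, s = count_sketch(x, rows=7, width=6 * k, rng=rng)

# Compare per-coordinate squared error against the classic l_infty bound
# ||x_tail||_2^2 / k; the paper shows most coordinates do much better.
err = np.array([estimate(C, h, s, i) - x[i] for i in range(200)])
tail_bound = np.linalg.norm(np.sort(x)[: n - k]) ** 2 / k
print("fraction of coords beating the l_infty bound:",
      (err ** 2 < tail_bound).mean())
```

On a run like this, the bulk of the coordinates land well inside the worst-case bound, which is the qualitative behavior the paper's refined per-coordinate analysis predicts.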
Recommendations
- Beating CountSketch for heavy hitters in insertion streams
- On low-risk heavy hitters and sparse recovery schemes
- High probability frequency moment sketches
- On deterministic sketching and streaming for sparse recovery and norm estimation
Mathematics Subject Classification
- Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
- Analysis of algorithms (68W40)
- Data structures (68P05)
Cited In (9)
- Sparsifying count sketch
- Efficient sketches for the set query problem
- On deterministic sketching and streaming for sparse recovery and norm estimation
- Perfect \(L_p\) sampling in a data stream
- LATIN 2004: Theoretical Informatics
- Unbiased estimation of inner product via higher order count sketch
- Optimizing the confidence bound of count-min sketches to estimate the streaming big data query results more precisely
- On low-risk heavy hitters and sparse recovery schemes
- Count-min sketch with variable number of hash functions: an experimental study