The space complexity of inner product filters

From MaRDI portal
Publication:6325866

DOI10.4230/LIPICS.ICDT.2020.22arXiv1909.10766MaRDI QIDQ6325866FDOQ6325866


Authors: Rasmus Pagh, Johan Sivertsen Edit this on Wikidata


Publication date: 24 September 2019

Abstract: Motivated by the problem of filtering candidate pairs in inner product similarity joins we study the following inner product estimation problem: Given parameters , and unit vectors consider the task of distinguishing between the cases and langlex,yanglegeqalpha where langlex,yangle=sumi=1dxiyi is the inner product of vectors x and y. The goal is to distinguish these cases based on information on each vector encoded independently in a bit string of the shortest length possible. In contrast to much work on compressing vectors using randomized dimensionality reduction, we seek to solve the problem deterministically, with no probability of error. Inner product estimation can be solved in general via estimating langlex,yangle with an additive error bounded by . We show that bits of information about each vector is necessary and sufficient. Our upper bound is constructive and improves a known upper bound of dlog2(1/varepsilon)+O(d) by up to a factor of 2 when is close to 1. The lower bound holds even in a stronger model where one of the vectors is known exactly, and an arbitrary estimation function is allowed.













This page was built for publication: The space complexity of inner product filters

Report a bug (only for logged in users!)Click here to report a bug for this page (MaRDI item Q6325866)