Reconstruction and subgaussian operators in asymptotic geometric analysis (Q2475582)

    Published: 11 March 2008
The paper deals with reconstruction problems in \(\mathbb{R}^n\). The first (main) part of the paper is devoted to approximate reconstruction. The goal is to approximate an unknown vector \(v\in T\), where \(T\) is a set in \(\mathbb{R}^n\). The data one is given are the set \(T\) and \(k<n\) (usually, \(k\) much smaller than \(n\)) linear measurements, that is, \(k\) inner products \((X_i, v)\), \(i\leq k\), where the \(X_i\)'s are i.i.d.\ isotropic sub-Gaussian random vectors in \(\mathbb{R}^n\). The authors prove that, if \(T\) is a centrally symmetric convex body, \(y\in T\), and \(\{(X_i, y)\}_i\) is close to \(\{(X_i, v)\}_i\), then \(y\) is close to \(v\) with high probability. They provide a quantitative estimate for the Euclidean norm of \(y-v\) in terms of a well-known geometric parameter of \(T\), namely \(\ell_*(T)=\mathbb{E}\|\sum_i g_ie_i\|_T\), where the \(g_i\)'s are independent standard Gaussian \(\mathcal{N}(0,1)\) random variables and \(\{e_i\}_i\) is the canonical basis of \(\mathbb{R}^n\). This result extends previous results of \textit{D.\,L.\thinspace Donoho} [IEEE Trans.\ Inf.\ Theory 52, 1289--1306 (2006; DOI 10.1109/TIT.2006.871582)] and of \textit{E.\,Candès} and \textit{T.\,Tao} [IEEE Trans.\ Inf.\ Theory 52, No.\,12, 5406--5425 (2006; DOI 10.1109/TIT.2006.885507)] to the case of a general \(T\) and to sub-Gaussian distributions.

The second part of the paper is devoted to exact reconstruction of vectors with small support, a problem which stems from error-correcting codes. As was shown by \textit{S.\,S.\thinspace Chen, D.\,L.\thinspace Donoho} and \textit{M.\,A.\thinspace Saunders} [SIAM J.~Sci.\ Comput.\ 20, No.\,1, 33--61 (1999; Zbl 0919.94002)], in order to identify the noise vector, the following minimization problem can be used: minimize \(\|v\|_1\) under the constraint \(\Gamma v =\Gamma z\), where \(\Gamma\) is a \(k\times n\) (coding) matrix and \(z\in \mathbb{R}^n\) is an input vector.
\textit{E.\,Candès} and \textit{T.\,Tao} [IEEE Trans.\ Inf.\ Theory 51, No.\,12, 4203--4215 (2005; DOI 10.1109/TIT.2005.858979)] and \textit{M.\,Rudelson} and \textit{R.\,Vershynin} [Int.\ Math.\ Res.\ Not.\ 2005, No.\,64, 4019--4041 (2005; Zbl 1103.94014)] proved that for a Gaussian matrix \(\Gamma\) the minimization problem above has a unique solution, which is \(z\) (under some conditions on \(k\), \(n\), and on the support of \(z\)). In the present paper, this result is extended to matrices with rows sampled independently according to an isotropic sub-Gaussian measure. In particular, this yields new information about \(\pm 1\) random polytopes, i.e., polytopes with vertices chosen randomly from the vertices of the cube. More precisely, the authors prove that a random \(k\)-dimensional \(\pm 1\) polytope with \(n\) vertices is an \(m\)-neighborly polytope if \(m\leq c k / \ln (Cn/k)\).
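The paper itself is theoretical, but the \(\ell_1\)-minimization problem it studies is easy to run numerically. The sketch below (not the authors' construction; dimensions and the solver are illustrative choices) takes a \(k\times n\) matrix \(\Gamma\) with i.i.d.\ \(\pm 1\) entries, a simple example of rows drawn from an isotropic sub-Gaussian distribution, and recovers a vector \(z\) of small support by minimizing \(\|v\|_1\) subject to \(\Gamma v = \Gamma z\), rewritten as a linear program via the standard split \(v = p - q\), \(p, q \ge 0\):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, k, m = 80, 40, 5  # ambient dimension, number of measurements, support size

# Sub-Gaussian measurement matrix: i.i.d. +/-1 (symmetric Bernoulli) entries.
Gamma = rng.choice([-1.0, 1.0], size=(k, n))

# Unknown vector z with support of size m.
z = np.zeros(n)
support = rng.choice(n, size=m, replace=False)
z[support] = rng.normal(size=m)
b = Gamma @ z  # the k linear measurements

# Basis pursuit as an LP: write v = p - q with p, q >= 0,
# so that ||v||_1 = sum(p) + sum(q) and Gamma v = Gamma p - Gamma q.
c = np.ones(2 * n)
A_eq = np.hstack([Gamma, -Gamma])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
v = res.x[:n] - res.x[n:]

print("recovered exactly:", np.allclose(v, z, atol=1e-5))
```

With \(m \leq c k/\ln(Cn/k)\), as in the result quoted above, such recovery succeeds with high probability; here \(m=5\), \(k=40\), \(n=80\) leaves ample room.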
    Keywords: approximate reconstruction; empirical processes; exact reconstruction; linear measurements; neighborly polytopes; \(\pm 1\) random polytopes; sub-Gaussian operators; sub-Gaussian random vectors; asymptotic geometric analysis