Lower estimates for the singular values of random matrices (Q817897)
From MaRDI portal
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | Lower estimates for the singular values of random matrices | scientific article | |
Statements
Lower estimates for the singular values of random matrices (English)
20 March 2006
A random variable \(\beta\) is called subgaussian if \(\mathbf P(|\beta|>t)<a\exp(-bt^2)\) for some \(a,b>0\) and every \(t>0\). Let \(A_n\) be an \(N\times n\) matrix whose entries are independent, identically distributed subgaussian random variables with mean 0. It is known that, as \(n\to\infty\) with \(\delta=(N-n)/n>0\) fixed, the smallest and largest singular values \(s_1\) and \(s_2\) of \(A_n\), normalized by \(\sqrt N\), converge almost surely to \(1-\sqrt{n/N}\) and \(1+\sqrt{n/N}\), respectively. Later, such convergence was also proved for \(\delta\) tending to zero. The author states the almost sure convergence of \(s_1\) and \(s_2\) for any positive \(\delta<1\). He then shows that, with high probability, the norms \(l_2^N\) and \(l_1^N\) are equivalent on the space \(E=A_n\mathbb R^n\). As a corollary, the author proves a strengthened version of \textit{B. S. Kashin}'s theorem [Izv. Akad. Nauk SSSR, Ser. Mat. 41, 334--351 (1977; Zbl 0354.46021)] on sections of the standard octahedron and obtains a polynomial estimate for the diameter of the sections.
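To make the stated asymptotics concrete, here is a minimal numerical sketch (an editorial illustration, not part of the paper): it samples an \(N\times n\) matrix with standard Gaussian entries, one particular mean-zero subgaussian distribution, and compares the extreme singular values, normalized by \(\sqrt N\), with the limits \(1\mp\sqrt{n/N}\). The sizes `n = 1000` and `delta = 0.2` are arbitrary choices for the demonstration.

```python
import numpy as np

# Illustrative sketch: Gaussian entries are one example of i.i.d.
# mean-zero subgaussian random variables.
rng = np.random.default_rng(0)

n = 1000
delta = 0.2                # aspect-ratio parameter delta = (N - n) / n
N = int((1 + delta) * n)   # N > n, so A is a tall N x n matrix

A = rng.standard_normal((N, n))          # i.i.d. N(0, 1) entries
s = np.linalg.svd(A, compute_uv=False)   # singular values, descending order

s_min, s_max = s[-1] / np.sqrt(N), s[0] / np.sqrt(N)
edge = np.sqrt(n / N)

print(f"normalized smallest singular value: {s_min:.3f}  (limit 1 - sqrt(n/N) = {1 - edge:.3f})")
print(f"normalized largest  singular value: {s_max:.3f}  (limit 1 + sqrt(n/N) = {1 + edge:.3f})")
```

With these sizes a single draw already lands close to the limiting values; shrinking `delta` pushes the normalized smallest singular value toward 0, which is the regime the paper's lower estimates address.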
subgaussian random variables
convergence