Entropy, universal coding, approximation, and bases properties (Q1879237)

From MaRDI portal
DOI: 10.1007/s00365-003-0556-z
Reviewed by: Yu. I. Makovoz
MaRDI profile type: MaRDI publication profile
Full work available at URL: https://doi.org/10.1007/s00365-003-0556-z
OpenAlex ID: W1980161306

Language: English
Label: Entropy, universal coding, approximation, and bases properties
Description: scientific article

    Statements

    Entropy, universal coding, approximation, and bases properties (English)
    Publication date: 22 September 2004
    Let \({\mathcal E}=\{e_n\}_{n=0}^\infty\) be a Schauder basis of a Banach space \(X\). For \(f=\sum\theta_ie_i \in X\) the linear approximation error is defined by \[ \rho_n(f,{\mathcal E},X)=\left\| \sum_{i=n}^\infty \theta_ie_i \right\|_X , \] and the nonlinear, \(n\)-term, approximation error by \[ \sigma_n(f,{\mathcal E},X)=\inf_h \left\{\| f-h\| : h=\sum_{i \in \Lambda} \theta_ie_i, \; | \Lambda | \leq n \right\}. \] For \(0<r,s<\infty\), the space \(B_r^s=B_r^s({\mathcal E},X)\) is defined as the set of all \(f \in X\) for which \[ \| f\| _{B_r^s({\mathcal E},X)}:=\left\{ \sum_{n=1}^\infty \bigl[n^s\rho_{n-1}(f,{\mathcal E},X)\bigr]^r\,\frac{1}{n} \right\}^{1/r}<\infty , \] with the usual modification for \(r=\infty\). The space \(A_r^s=A_r^s({\mathcal E},X)\) is defined in the same way, with \(\sigma_n(f,{\mathcal E},X)\) in place of \(\rho_n(f,{\mathcal E},X)\).
    The authors establish metric entropy results under the assumption that the normalized basis \({\mathcal E}\) is quasi-greedy and satisfies Temlyakov's \(p\)-property for some \(p>0\). They prove that the \(\varepsilon\)-entropy in \(X\) of the unit ball \(U_r^s\) of the space \(B_r^s\) is \(\asymp \varepsilon^{-1/s}\), while for \(0<\delta<s\) the \(\varepsilon\)-entropy of the intersection of \(U_r^\delta\) with the unit ball \(V_r^s\) of the space \(A_r^s\) is \(\asymp \varepsilon^{-1/s}\log(1/\varepsilon)\). Thus the metric entropy distinguishes between linear and nonlinear approximation. The lower bounds for the \(\varepsilon\)-entropy are obtained from large deviation estimates for binomial and hypergeometric distributions; the upper bounds rest on a special universal coding. Extending and refining Donoho's results for Hilbert spaces, the authors prove that, under certain conditions, the spaces \(A_\infty^r\) are characterized by this metric entropy behavior. The last section of the paper concerns applications to wavelets in \(L_p\).
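    To make the definitions concrete, here is a minimal numerical sketch (not taken from the paper) for the simplest setting, a Hilbert space with an orthonormal basis, where every norm reduces to the \(\ell^2\) norm of the coefficient sequence; the function names, the test coefficients, and the truncation of the sums defining the quasi-norms are illustrative assumptions.

        import numpy as np

        def linear_error(theta, n):
            # rho_n: l2 norm of the tail sum_{i >= n} theta_i e_i (orthonormal basis assumed)
            return float(np.sqrt(np.sum(theta[n:] ** 2)))

        def nonlinear_error(theta, n):
            # sigma_n: best n-term error = l2 norm of everything except the n largest coefficients
            order = np.argsort(np.abs(theta))[::-1]
            return float(np.sqrt(np.sum(theta[order[n:]] ** 2)))

        def approx_quasinorm(errors, s, r):
            # { sum_{n>=1} [n^s * errors[n-1]]^r * (1/n) }^{1/r}, truncated to the data at hand;
            # feed the rho_* sequence for B_r^s and the sigma_* sequence for A_r^s
            n = np.arange(1, len(errors) + 1)
            return float(np.sum((n ** s * np.asarray(errors)) ** r / n) ** (1.0 / r))

        # Hypothetical test sequence: coefficients ~ 1/i, scrambled by a random permutation,
        # so the index order is bad for linear approximation but irrelevant for n-term approximation.
        rng = np.random.default_rng(0)
        theta = 1.0 / np.arange(1.0, 2001.0)
        theta = theta[rng.permutation(theta.size)]

        rho = [linear_error(theta, n) for n in range(theta.size)]
        sigma = [nonlinear_error(theta, n) for n in range(theta.size)]
        print(approx_quasinorm(rho, 0.4, 2.0), approx_quasinorm(sigma, 0.4, 2.0))

    On such a scrambled sequence \(\sigma_n\) still decays roughly like \(n^{-1/2}\) while \(\rho_n\) stays nearly flat, so the truncated \(A_r^s\) quasi-norm remains moderate while the truncated \(B_r^s\) sum keeps growing as more coefficients are included; this is the linear/nonlinear gap that the entropy results quantify.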
    Keywords: entropy; coding; \(m\)-term approximation; greedy bases

    Identifiers