Unconditional bases and bit-level compression
From MaRDI portal
Publication:2564044
DOI: 10.1006/ACHA.1996.0032 · zbMATH Open: 0936.62004 · OpenAlex: W2021586389 · MaRDI QID: Q2564044
Authors: David Donoho
Publication date: 18 May 2000
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://semanticscholar.org/paper/0e048e70c56c4b98c58657505b75af5e6b6f9296
Recommendations
- Unconditional bases are optimal bases for data compression and for statistical estimation
- Universal Compression of Memoryless Sources Over Unknown Alphabets
- Compression and Hadamard power inequalities
- Inequalities and algorithms for universal data compression
- Compressibility and uniform complexity
- scientific article; zbMATH DE number 78498
- Universal almost sure data compression
- scientific article; zbMATH DE number 3463524
- Universal Algorithms for Channel Decoding of Uncompressed Sources
Mathematics Subject Classification: Statistical aspects of information-theoretic topics (62B10); Source coding (94A29); General harmonic expansions, frames (42C15)
Cited In (24)
- Thresholding algorithms, maxisets and well-concentrated bases
- On the entropy numbers between the anisotropic spaces and the spaces of functions with mixed smoothness
- Tree approximation with anisotropic decompositions
- Wedgelets: Nearly minimax estimation of edges
- Thresholding procedure with priors based on Pareto distributions
- FUNCTIONAL APPROXIMATION IN MULTISCALE COMPLEX SYSTEMS
- Metric entropy limits on recurrent neural network learning of linear dynamical systems
- Dimensionality reduction and greedy learning of convoluted stochastic dynamics
- Efficient hedging of options with probabilistic Haar wavelets
- Nonlinear estimation over weak Besov spaces and minimax Bayes
- Greedy bases are best for \(m\)-term approximation
- Entropy numbers of functions on \([-1,1]\) with Jacobi weights
- Bayesian maximum entropy based algorithm for digital X-ray mammogram processing
- On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces
- Estimates for entropy numbers of sets of smooth functions on the torus \(\mathbb{T}^d\)
- On the representation of smooth functions on the sphere using finitely many bits
- Information-theoretic determination of minimax rates of convergence
- Unconditional convergence of Fourier expansions in systems of product bases in Orlicz spaces
- Tree approximation and optimal encoding
- Optimally sparse data representations
- Replicant compression coding in Besov spaces
- New tight frames of curvelets and optimal representations of objects with piecewise \(C^2\) singularities
- Unconditional bases are optimal bases for data compression and for statistical estimation
- Entropy numbers of Besov classes of generalized smoothness on the sphere
This page was built for publication: Unconditional bases and bit-level compression