Finite Blocklength Lossy Source Coding for Discrete Memoryless Sources
Publication: 6139507
DOI: 10.1561/0100000134
zbMath: 1529.94020
arXiv: 2301.07871
MaRDI QID: Q6139507
Publication date: 19 December 2023
Published in: Foundations and Trends® in Communications and Information Theory
Full work available at URL: https://arxiv.org/abs/2301.07871
Keywords: data compression, quantization, source coding, rate-distortion theory, Shannon theory, multiuser information theory
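
For readers scanning the keyword list, the monograph's central quantity is the finite-blocklength rate-distortion trade-off. As a sketch of the flavor of the results, the following display follows the second-order expansion from Kostina and Verdú's "Fixed-Length Lossy Compression in the Finite Blocklength Regime" (cited below); the symbols R(D), V(D), and Q^{-1} are the standard notation of that literature, not something fixed by this record:

\[
R(n, D, \varepsilon) \;=\; R(D) \;+\; \sqrt{\frac{V(D)}{n}}\, Q^{-1}(\varepsilon) \;+\; O\!\left(\frac{\log n}{n}\right),
\]

where \(R(D) = \min_{P_{\hat X \mid X} :\, \mathbb{E}[d(X,\hat X)] \le D} I(X;\hat X)\) is the rate-distortion function of the discrete memoryless source, \(V(D)\) is the rate-dispersion function, \(Q^{-1}\) is the inverse Gaussian complementary CDF, and \(R(n, D, \varepsilon)\) is the minimal rate of length-\(n\) fixed-length codes whose excess-distortion probability is at most \(\varepsilon\).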
Cites Work
- Information Theory
- A Mathematical Theory of Communication
- On the rate of convergence in the multivariate CLT
- Hierarchical coding of discrete sources
- Estimation of mean error for a discrete successive-approximation scheme
- The information theory approach to communications
- On the dependence of the Berry-Esseen bound on dimension
- On the divisibility of discrete sources with an additive single-letter distortion measure
- A multivariate Berry-Esseen theorem with explicit constants
- Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities
- Nonasymptotic Noisy Lossy Source Coding
- Strong Successive Refinability and Rate-Distortion-Complexity Tradeoff
- Variable-Length Compression Allowing Errors
- An Information-Spectrum Approach to Weak Variable-Length Source Coding With Side-Information
- The Third-Order Term in the Normal Approximation for the AWGN Channel
- A Case Where Interference Does Not Affect the Channel Dispersion
- Nonasymptotic and Second-Order Achievability Bounds for Coding With Side-Information
- The Dispersion of Nearest-Neighbor Decoding for Additive Non-Gaussian Channels
- Indirect and Direct Gaussian Distributed Source Coding Problems
- Moderate Deviations in Channel Coding
- Second-Order Coding Rates for Channels With State
- The Lossy Common Information of Correlated Sources
- Lossy Joint Source-Channel Coding in the Finite Blocklength Regime
- Second-Order Region for Gray–Wyner Network
- Distributed Source Coding of Correlated Gaussian Remote Sources
- Network Information Theory
- Asymptotic Properties on Codeword Lengths of an Optimal FV Code for General Sources
- Mismatched codebooks and the role of entropy coding in lossy data compression
- Multiple description coding with many channels
- Universally Attainable Error Exponents for Rate-Distortion Coding of Noisy Sources
- Second-Order Asymptotics in Fixed-Length Source Coding and Intrinsic Randomness
- Rate distortion when side information may be absent
- On multiple descriptions and team guessing
- New results in binary multiple descriptions
- On a Source-Coding Problem with Two Channels and Three Receivers
- Source Coding for Multiple Descriptions
- Limit theorems for uniform distributions on spheres in high-dimensional Euclidean spaces
- Achievable rates for multiple descriptions
- Source Coding for a Simple Network
- Error exponent for source coding with a fidelity criterion
- The rate-distortion function for source coding with side information at the decoder
- Successive refinement of information: characterization of the achievable rates
- A lower bound on the expected length of one-to-one codes
- On the role of mismatch in rate distortion theory
- The redundancy of source coding with a fidelity criterion. 1. Known statistics
- Failure of successive refinement for symmetric Gaussian mixtures
- Weak variable-length source coding
- Gaussian codes and Shannon bounds for multiple descriptions
- Pointwise redundancy in lossy data compression and universal lossy data compression
- Critical behavior in lossy source coding
- A Single-Shot Approach to Lossy Source Coding Under Logarithmic Loss
- Source coding, large deviations, and approximate pattern matching
- On the rate-distortion region for multiple descriptions
- Arbitrary source models and Bayesian codebooks in rate-distortion theory
- Error exponents in scalable source coding
- Computation and analysis of the n-layer scalable rate-distortion function
- Asymptotics and Non-Asymptotics for Universal Fixed-to-Variable Source Coding
- Quantization
- The method of types [information theory]
- Lossy source coding
- On the redundancy of lossy source coding with abstract alphabets
- Rate-distortion function when side-information may be present at the decoder
- Error exponents for successive refinement by partitioning
- Nearest neighbor decoding for additive non-Gaussian noise channels
- Information Spectrum Approach to Second-Order Coding Rate in Channel Coding
- On the Redundancy of Slepian–Wolf Coding
- Nonstationary Gauss–Markov Processes: Parameter Estimation and Dispersion
- Third-Order Asymptotics of Variable-Length Compression Allowing Errors
- Information-Theoretic Foundations of Mismatched Decoding
- Lossless Source Coding in the Point-to-Point, Multiple Access, and Random Access Scenarios
- Variable-Length Source Dispersions Differ Under Maximum and Average Error Criteria
- Second Order Analysis for Joint Source-Channel Coding With General Channel and Markovian Source
- The Dispersion of the Gauss–Markov Source
- Successive Refinement of Abstract Sources
- Non-Asymptotic Converse Bounds and Refined Asymptotics for Two Source Coding Problems
- The Dispersion of Mismatched Joint Source-Channel Coding for Arbitrary Sources and Additive Channels
- Refined Asymptotics for Rate-Distortion Using Gaussian Codebooks for Arbitrary Sources
- Fixed-Length Lossy Compression in the Finite Blocklength Regime
- Discrete Lossy Gray–Wyner Revisited: Second-Order Asymptotics, Large and Moderate Deviations
- Channel Coding Rate in the Finite Blocklength Regime
- Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics
- On the Dispersions of Three Network Information Theory Problems
- Second-Order and Moderate Deviation Asymptotics for Successive Refinement
- Information rates of Wiener processes
- Transmission of noisy information to a noisy receiver with minimum distortion
- Information rates of autoregressive processes
- Rate distortion functions for finite-state finite-alphabet Markov sources
- Information transmission with additional noise
- An upper bound on the entropy series
- The random coding bound is tight for the average code (Corresp.)
- Noiseless coding of correlated information sources
- Random Packings and Coverings of the Unit n-Sphere
- Successive refinement of information