Achievable complexity-performance tradeoffs in lossy compression
Publication:1945149
DOI: 10.1134/S0032946012040060 · zbMath: 1312.94023 · OpenAlex: W2083755992 · MaRDI QID: Q1945149
Sergio Verdú, Ankit Gupta, Tsachy Weissman
Publication date: 3 April 2013
Published in: Problems of Information Transmission
Full work available at URL: https://doi.org/10.1134/s0032946012040060
Related Items (1)
Cites Work
- Complexity-compression tradeoffs in lossy compression via efficient random codebooks and databases
- Linear-time encodable and decodable error-correcting codes
- A coding theorem for lossy data compression by LDPC codes
- Complexity Versus Performance of Capacity-Achieving Irregular Repeat–Accumulate Codes on the Binary Erasure Channel
- Accumulate–Repeat–Accumulate Codes: Capacity-Achieving Ensembles of Systematic Codes for the Erasure Channel With Bounded Complexity
- Error exponent for source coding with a fidelity criterion
- The redundancy of source coding with a fidelity criterion. 1. Known statistics
- An implementable lossy version of the Lempel-Ziv algorithm. I. Optimality for memoryless sources
- Efficient erasure correcting codes
- On the role of pattern matching in information theory
- Simple universal lossy data compression schemes derived from the Lempel-Ziv algorithm
- Nonlinear Sparse-Graph Codes for Lossy Compression
- Channel Polarization: A Method for Constructing Capacity-Achieving Codes for Symmetric Binary-Input Memoryless Channels
- Fixed-Length Lossy Compression in the Finite Blocklength Regime
- Lossy Compression of Discrete Sources via the Viterbi Algorithm
- Polar Codes are Optimal for Lossy Source Coding
- Probability Inequalities for Sums of Bounded Random Variables
- Elements of Information Theory