scientific article; zbMATH DE number 4116450
From MaRDI portal
Publication: 4731122
zbMath: 0681.94001; MaRDI QID: Q4731122
Publication date: 1987
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Keywords: sources; Gaussian channels; data compression codes; continuous channels; estimation theory; block codes for data transmission; data compaction codes for discrete sources; data translation codes for discrete constrained channels; data transmission codes; hypotheses testing; multiterminal information networks
MSC classification: Information theory (general) (94A15); Theory of error-correcting codes and error-detecting codes (94B99); Communication theory (94A05); Introductory exposition (textbooks, tutorial papers, etc.) pertaining to information and communication theory (94-01)
Related Items
Jeffreys' prior is asymptotically least favorable under entropy risk, Distributed nonparametric function estimation: optimal rate of convergence and cost of adaptation, Bounds on Data Compression Ratio with a Given Tolerable Error Probability, Neural coding of categories: information efficiency and optimal population codes, Strong decomposition of random variables, Linking information reconciliation and privacy amplification, Parsimonious reduction of Gaussian mixture models with a variational-Bayes approach, A class of random polynomials with an invariance property, Statistical Problem Classes and Their Links to Information Theory, An information-theoretic model for steganography, An ergodic theorem for constrained sequences of functions, Relative submajorization and its use in quantum resource theories, Curvature entropy for curved profile generation, Comparison, utility, and partition of dependence under absolutely continuous and singular distributions, Approximate Entropy as an Irregularity Measure for Financial Data, Sub- and superadditivity à la Carlen of matrices related to the Fisher information, The sphere packing bound for memoryless channels, The proper formula for relative entropy and its asymptotics in quantum probability, Some properties of generalized exponential entropies with applications to data compression, A universal statistical test for random bit generators, On data complexity of distinguishing attacks versus message recovery attacks on stream ciphers, Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis, Multiple Objects: Error Exponents in Hypotheses Testing and Identification, REVIEW OF CHAOS COMMUNICATION BY FEEDBACK CONTROL OF SYMBOLIC DYNAMICS, The attractor-basin portrait of a cellular automaton, A discrete version of the Stam inequality and a characterization of the Poisson distribution, Information measures for global geopotential models, Information-theoretic comparisons of cellular multiple-access systems with bandwidth-dependent fading considerations, An information-theoretic fuzzy C-spherical shells clustering algorithm, Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework, A minimally informative likelihood for decision analysis: Illustration and robustness, Of bits and wows: a Bayesian theory of surprise with applications to attention, Optimal Estimation via Nonanticipative Rate Distortion Function and Applications to Time-Varying Gauss–Markov Processes, Quantifying decoherence of Gaussian noise channels, On convergence properties of Shannon entropy, Unnamed Item, On the entropy devil's staircase in a family of gap-tent maps, Effective entropies and data compression, Optimal data compression algorithm, Universal Data Compression Algorithm Based on Approximate String Matching, Interactive unsupervised classification and visualization for browsing an image collection, Performance robustness of a noise-assisted transmission line, Stationary tail probabilities in exponential server tandems with renewal arrivals, Mutual information and self-control of a fully-connected low-activity neural network, Devil-staircase behavior of dynamical invariants in chaotic scattering, Properties of noncommutative Rényi and Augustin information