A Mathematical Theory of Communication



DOI: 10.1002/j.1538-7305.1948.tb01338.x
zbMath: 1154.94303
Wikidata: Q724029
Scholia: Q724029
MaRDI QID: Q88168

Claude E. Shannon

Publication date: July 1948

Published in: Bell System Technical Journal

Full work available at URL: http://hdl.handle.net/11858/00-001M-0000-002C-4314-2


Mathematics Subject Classification:

94-02: Research exposition (monographs, survey articles) pertaining to information and communication theory

94A17: Measures of information, entropy

94A15: Information theory (general)

94A05: Communication theory

94A24: Coding theorems (Shannon theory)

94Axx: Communication, information


Related Items

SpatEntropy
The Jensen-Shannon divergence
Relative difference in diversity between populations
On quantification of different facets of uncertainty
Some results on ordering of survival functions through uncertainty
Information theory as a unifying statistical approach for use in marketing research
Effect of class-interval size on entropy
Complexity of functions: Some questions, conjectures, and results
An information theory of image gathering
Certainty equivalents and information measures: Duality and extremal principles
Fuzzy set and probabilistic techniques for health-risk analysis
Local randomness in pseudorandom sequences
On lags and chaos in economic dynamic models
An iterative method for computing the performance of discrete memoryless communication channels
Organization by rules in finite sequences
\((R,S)\)-information radius of type \(t\) and comparison of experiments
Approximating queue lengths in \(M(t)/G/1\) queue using the maximum entropy principle
Minimum cross-entropy analysis with entropy-type constraints
Higher order fuzzy entropy and hybrid entropy of a set
Approximate string-matching with \(q\)-grams and maximal matches
Data compression with factor automata
A survey of multiple sequence comparison methods
Artificial neural network classification of \(Drosophila\) courtship song mutants
Measures of uncertainty and information in computation
Some properties of the exponential entropy
Application of rough set theory for clinical data analysis: A case study
A universal statistical test for random bit generators
Cost-performance tradeoffs for interconnection networks
Making inflexible investment decisions with incomplete information
Pattern associativity and the retrieval of semantic networks
Algebraic properties of cryptosystem PGM
A multiple sequence comparison method
A generalized model for the analysis of association in ordinal contingency tables
Analysis of mutual information measures in cluster sampling
Relative entropy under mappings by stochastic matrices
Information theoretic analysis of action potential trains. I: Analysis of correlation between two neurons
Optimal scaling of two-level factorial experiments
At the dawn of the theory of codes
Complexity measures for concurrent programs based on information-theoretic metrics
\((h,\Psi)\)-entropy differential metric
A general class of entropy statistics
Uniform bounds for sampling expansions.
On measures of information energy
Dirac's representation theory as a framework for signal theory. I: Discrete finite signals
Dirac's representation theory as a framework for signal theory. II: Infinite duration and continuous signals
Optimal representation in average using Kolmogorov complexity
Information and impossibilities
Generating functions of circular codes
An application of a general sampling theorem
A new method for estimating model parameters for multinomial data
Renyi's entropy as an index of diversity in simple-stage cluster sampling
Ordering univariate distributions by entropy and variance
On the diversity of equity markets
Ambiguity resistant polynomial matrices
Gaussian clustering method based on maximum-fuzzy-entropy interpretation
Characterizing linear size circuits in terms of privacy
Range of validity of \(\alpha\) and \(\beta\) for a generalized diversity index \(H(\alpha,\beta)\) due to Good
On the size of shares for secret sharing schemes
The generalized entropy measure to the design and comparison of regression experiment in a Bayesian context
Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems
On probability-possibility transformations
Sum form equations of multiplicative type
Entropy maximisation and queueing network models
A theory of measurement in diagnosis from first principles
On some entropy methods in data analysis
Kolmogorov complexity arguments in combinatorics
Asymptotic properties of divergence statistics in a stratified random sampling and its applications to test statistical hypotheses
Fuzzy sets with inner interdependence
Entropy of natural languages: Theory and experiment
Generalized divergence measures and the probability of error
Mutual information and global strange attractors in Taylor-Couette flow
On the channel capacity of read/write isolated memory
Order statistics and \((r,s)\)-entropy measures
A limit theorem for the entropy density of nonhomogeneous Markov information source
A maximum entropy approach to estimation and inference in dynamic models or Counting fish in the sea using maximum entropy
Information and probabilistic reasoning
Statistical equilibrium in one-step forward looking economic models
Divergence statistics based on entropy functions and stratified sampling
The uncertainty principle: A mathematical survey
Estimation and inference with censored and ordered multinomial response data
The knowledge content of statistical data
Characterizations of sum form information measures on open domains
Stochastic optimization algorithms of a Bayesian design criterion for Bayesian parameter estimation of nonlinear regression models: Application in pharmacokinetics
Parameter estimation for 2-parameter generalized Pareto distribution by POME
The discovery of algorithmic probability
An application of lifting theory to optical communication processes
Testing whether lifetime distribution is decreasing uncertainty
A test for population collinearity. A Kullback-Leibler information approach
SPIRIT and Léa Sombé: A study in probabilistic reasoning
Qualitative multicriteria methods for fuzzy evaluation problems: An illustration of economic-ecological evaluation
Relative entropy in sequential decision problems
Sensitivity of ridge-type estimation methods to condition number
Entropic integrals of orthogonal hypergeometric polynomials with general supports
The characterization of a measure of information discrepancy
Stochastic analog networks and computational complexity
Residual entropy and its characterizations in terms of hazard function and mean residual life function.
An essay on the history of inequalities
Symmetry of information and one-way functions
Sequential sampling designs for the two-parameter item response theory model
A universal model of single-unit sensory receptor action
Strong shift equivalence theory and the shift equivalence problem
Strong asymptotics of Laguerre polynomials and information entropies of two-dimensional harmonic oscillator and one-dimensional Coulomb potentials
Bounds on Data Compression Ratio with a Given Tolerable Error Probability
Entropy estimators - improvements and comparisons
Data processing using information theory functionals
On the Fisher Information
AN UNCERTAINTY MEASURE WITH MONOTONICITY UNDER THE RANDOM SET INCLUSION
Information and evidence in logic systems
MEASURING TOTAL UNCERTAINTY IN DEMPSTER-SHAFER THEORY: A NOVEL APPROACH
Maximum entropy algorithm for spectral estimation problem with a priori information
Spatial entropy of central potentials and strong asymptotics of orthogonal polynomials
On testing hypotheses with divergence statistics
Interleaved concatenated codes: New perspectives on approaching the Shannon limit
Information loss, abstract entropy, and the mathematical description of the second law of thermodynamics
ON INFORMATION-PRESERVING TRANSFORMATIONS
A nonparametric approach to k-sample inference based on entropy
The origins of entropy and irreversibility
Ensemble-Dependent Bounds for Accessible Information in Quantum Mechanics
Entropic integrals of hyperspherical harmonics and spatial entropy of D-dimensional central potentials
Toward a formulation of the human grasping quality sense
The MinMax information measure
Information gain within nonextensive thermostatistics
Quasi-Random Set Systems
The source-channel separation theorem revisited
On Unified (R,S)-Information Measures
MEASURING UNCERTAINTY IN ROUGH SET THEORY
COMPLETING A TOTAL UNCERTAINTY MEASURE IN THE DEMPSTER-SHAFER THEORY
The Asymptotic Equipartition Property for a Nonhomogeneous Markov Information Source
Entropy of flows, revisited
Invariant of dynamical systems: A generalized entropy
Fast multiple alignment of ungapped DNA sequences using information theory and a relaxation method
Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study
An information theoretic model of a reliable computer network
The evolution of self-replicating computer organisms
Strategic entropy and complexity in repeated games
Multivariate exponential families and the Taguchi loss function
Further improved 2-D lattice filter structure employing missing reflection coefficients
Statistical applications of order \(\alpha\)-\(\beta\) weighted information energy
A theory of coarse utility
Reversible and endoreversible computing
Recovering information in the case of underdetermined problems and incomplete economic data
Entropy, divergence and distance measures with econometric applications
Two new optimal ternary two-weight codes and strongly regular graphs
Mathematical problems in cryptology
Generalization of an invertible mapping between probability and possibility
Intrinsic losses
An extension of Shannon-McMillan theorem and some limit properties for nonhomogeneous Markov chains
A mathematical basis for egress complexity
Axiomatic information measures depending only on a probability measure
Test for homogeneity of several populations by stochastic complexity
An SE-tree-based prime implicant generation algorithm
Dipole information complementarity in discrete 2D patterns.
Dynamic design of a reliable computer network on using information theory
A new information-theoretic approach to the entropy of non-random discrete maps relation to fractal dimension and temperature of curves
Mutual information functions versus correlation functions.
A simple information theory scheme for the dynamics of the Dicke model
Approximate reconstruction of single particle bound states and potentials from incomplete information
The Langevin and Fokker-Planck equations in the framework of a generalized statistical mechanics
A statistical mechanical approach to generalized statistics of quantum and classical gases
The physical nature of information
Exploratory analysis of empirical frequency distributions based on partition entropy
Akaike's information criterion and recent developments in information complexity
The maximum entropy approach in production frontier estimation
The \(\lambda\)-divergence and the \(\lambda\)-mutual information: Estimation in the stratified sampling
Fast String Matching in Stationary Ergodic Sources
Flat Minima
On the information-based measure of covariance complexity and its application to the evaluation of multivariate linear models
Sampling in a Hilbert space
Mathematical techniques for quantum communication theory
The Maximum Entropy Principle in Decision Making Under Uncertainty: Special Cases Applicable to Developing Technologies
Quasicyclic Symmetry and the Shannon Entropy
ENTROPY MEASURES UNDER SIMILARITY RELATIONS
Multiplier methods for engineering optimization
AN APPROACH TO THE ENTROPY OF (A OR B) VIA MAXIMIZATION OF CONDITIONAL ENTROPY
Informational Observation Theory: An Extension to Two Continuous Sets of Events
From Entropy of Fuzzy Sets to Fuzzy Set of Entropies: A Critical Review and New Results
Lattice functions, wavelet aliasing, and SO(3) mappings of orthonormal filters
Optimal Investment Strategy for Risky Assets
Some derivations of the Shannon entropy
The φ-Entropy in the Selection of a Fixed Number of Experiments
On the asymptotic optimum allocation in estimating entropies
On the use of divergence statistics to make inferences about three habitats
Generalized Jensen difference divergence measures and Fisher measure of information
(h, φ)-information measure as a criterion of comparison of experiments in a Bayesian context
Multichannel digital systems and Information theory II. Verification of an experimental result on the optical perception