Logical basis for information theory and probability theory
From MaRDI portal
Publication: 5554774
DOI: 10.1109/TIT.1968.1054210
zbMath: 0167.47601
Wikidata: Q56038298 (Scholia: Q56038298)
MaRDI QID: Q5554774
No author found.
Publication date: 1968
Published in: IEEE Transactions on Information Theory
Related Items (78)
Complementarity of information obtained by Kolmogorov and Aksentijevic-Gibson complexities in the analysis of binary time series ⋮ Conformal Prediction: A Gentle Introduction ⋮ Entropy estimation of symbol sequences ⋮ Information entropy as a basic building block of complexity theory ⋮ Counting probability distributions: Differential geometry and model selection ⋮ Computational depth and reducibility ⋮ Computational depth and reducibility ⋮ Non-Algorithmic Theory of Randomness ⋮ COMPUTERIZED METHODOLOGY FOR THE EVALUATION OF LEVEL OF KNOWLEDGE ⋮ A test for randomness based on a complexity measure ⋮ Inequalities for space-bounded Kolmogorov complexity ⋮ Concentration theorems for entropy and free energy ⋮ What is quantum information? ⋮ Well-calibrated predictions from on-line compression models ⋮ The discovery of algorithmic probability ⋮ Information geometric methods for complexity ⋮ Clustering with respect to the information distance ⋮ INFORMATION IN PROPOSITIONAL PROOFS AND ALGORITHMIC PROOF SEARCH ⋮ Universality probability of a prefix-free machine ⋮ Entropy and algorithmic complexity in quantum information theory ⋮ On continued fraction randomness and normality ⋮ Reexamination of an information geometric construction of entropic indicators of complexity ⋮ Bayesian definition of random sequences with respect to conditional probabilities ⋮ Kolmogorov and mathematical logic ⋮ Inexactness and a future of computing ⋮ Martingales in the Study of Randomness ⋮ Andrei Kolmogorov and Leonid Levin on Randomness ⋮ A measure of shared information in classes of patterns ⋮ On the computational complexity of infinite words. ⋮ Unnamed Item ⋮ Inequalities for entropies and dimensions ⋮ Kolmogorov's Last Discovery? 
(Kolmogorov and Algorithmic Statistics) ⋮ On grammars, complexity, and information measures of biological macromolecules ⋮ Comparing descriptional and computational complexity of infinite words ⋮ An observer's information dynamics: acquisition of information and the origin of the cognitive dynamics ⋮ An additivity theorem for plain Kolmogorov complexity ⋮ Relationship of second-order lacunarity, Hurst exponent, Brownian motion, and pattern organization ⋮ Quantifying the complexity of geodesic paths on curved statistical manifolds through information geometric entropies and Jacobi fields ⋮ Deflating the deflationary view of information ⋮ Some theorems on the algorithmic approach to probability theory and information theory (1971 dissertation directed by A. N. Kolmogorov) ⋮ Theoretical investigations of an information geometric approach to complexity ⋮ Minimal-program complexity of pseudo-recursive and pseudo-random sequences ⋮ Limit complexities revisited ⋮ Quantitative and qualitative growth analysis ⋮ Multiplex signal transmission and the development of sampling techniques: the work of Herbert Raabe in contrast to that of Claude Shannon ⋮ Entropy and quantum Kolmogorov complexity: a quantum Brudno's theorem ⋮ Organization by rules in finite sequences ⋮ Accuracy, scope, and flexibility of models ⋮ The importance of complexity in model selection ⋮ Traffic grammar and algorithmic complexity in urban freeway flow patterns ⋮ What is Shannon information? ⋮ Quantum Algorithmic Complexities and Entropy ⋮ The emergence of meaning at the co-evolutionary level: an epistemological approach ⋮ SYSTEM IDENTIFICATION, APPROXIMATION AND COMPLEXITY ⋮ What one has to know when attacking \(\mathsf{P}\) vs.\(\mathsf{NP}\) ⋮ Regression Estimation from an Individual Stable Sequence ⋮ A theory of information structure I. 
General principles ⋮ Liouville, computable, Borel normal and Martin-Löf random numbers ⋮ Mathematics as information compression via the matching and unification of patterns ⋮ Randomness, independence, and hypotheses ⋮ Algorithmic tests and randomness with respect to a class of measures ⋮ Model discrimination using an algorithmic information criterion ⋮ One or Many Concepts of Information? ⋮ SYSTEMS AND DISTINCTIONS; DUALITY AND COMPLEMENT ARITY† ⋮ Randomness and Effective Dimension of Continued Fractions. ⋮ Mathematical metaphysics of randomness ⋮ Ergodic theorems for individual random sequences ⋮ A spectrum of compromise aggregation operators for multi-attribute decision making ⋮ Inequalities for Shannon entropy and Kolmogorov complexity ⋮ Measures of ignorance, information and uncertainty. I ⋮ Testing randomness online ⋮ On the application of algorithmic information theory to decision problems ⋮ A quantitative Occam's razor ⋮ On the relation between descriptional complexity and algorithmic probability ⋮ Thinking with notations: epistemic actions and epistemic activities in mathematical practice ⋮ A study of the fractal character in electronic noise processes ⋮ Uncontrollable computational growth in theoretical physics ⋮ Links between physics and set theory.