Logical basis for information theory and probability theory

From MaRDI portal

Publication:5554774

DOI: 10.1109/TIT.1968.1054210
zbMath: 0167.47601
Wikidata: Q56038298
Scholia: Q56038298
MaRDI QID: Q5554774

Author: Andrei N. Kolmogorov

Publication date: 1968

Published in: IEEE Transactions on Information Theory




Related Items (78)

Complementarity of information obtained by Kolmogorov and Aksentijevic-Gibson complexities in the analysis of binary time series
Conformal Prediction: A Gentle Introduction
Entropy estimation of symbol sequences
Information entropy as a basic building block of complexity theory
Counting probability distributions: Differential geometry and model selection
Computational depth and reducibility
Computational depth and reducibility
Non-Algorithmic Theory of Randomness
COMPUTERIZED METHODOLOGY FOR THE EVALUATION OF LEVEL OF KNOWLEDGE
A test for randomness based on a complexity measure
Inequalities for space-bounded Kolmogorov complexity
Concentration theorems for entropy and free energy
What is quantum information?
Well-calibrated predictions from on-line compression models
The discovery of algorithmic probability
Information geometric methods for complexity
Clustering with respect to the information distance
INFORMATION IN PROPOSITIONAL PROOFS AND ALGORITHMIC PROOF SEARCH
Universality probability of a prefix-free machine
Entropy and algorithmic complexity in quantum information theory
On continued fraction randomness and normality
Reexamination of an information geometric construction of entropic indicators of complexity
Bayesian definition of random sequences with respect to conditional probabilities
Kolmogorov and mathematical logic
Inexactness and a future of computing
Martingales in the Study of Randomness
Andrei Kolmogorov and Leonid Levin on Randomness
A measure of shared information in classes of patterns
On the computational complexity of infinite words.
Unnamed Item
Inequalities for entropies and dimensions
Kolmogorov's Last Discovery? (Kolmogorov and Algorithmic Statistics)
On grammars, complexity, and information measures of biological macromolecules
Comparing descriptional and computational complexity of infinite words
An observer's information dynamics: acquisition of information and the origin of the cognitive dynamics
An additivity theorem for plain Kolmogorov complexity
Relationship of second-order lacunarity, Hurst exponent, Brownian motion, and pattern organization
Quantifying the complexity of geodesic paths on curved statistical manifolds through information geometric entropies and Jacobi fields
Deflating the deflationary view of information
Some theorems on the algorithmic approach to probability theory and information theory (1971 dissertation directed by A. N. Kolmogorov)
Theoretical investigations of an information geometric approach to complexity
Minimal-program complexity of pseudo-recursive and pseudo-random sequences
Limit complexities revisited
Quantitative and qualitative growth analysis
Multiplex signal transmission and the development of sampling techniques: the work of Herbert Raabe in contrast to that of Claude Shannon
Entropy and quantum Kolmogorov complexity: a quantum Brudno's theorem
Organization by rules in finite sequences
Accuracy, scope, and flexibility of models
The importance of complexity in model selection
Traffic grammar and algorithmic complexity in urban freeway flow patterns
What is Shannon information?
Quantum Algorithmic Complexities and Entropy
The emergence of meaning at the co-evolutionary level: an epistemological approach
SYSTEM IDENTIFICATION, APPROXIMATION AND COMPLEXITY
What one has to know when attacking \(\mathsf{P}\) vs.\ \(\mathsf{NP}\)
Regression Estimation from an Individual Stable Sequence
A theory of information structure I. General principles
Liouville, computable, Borel normal and Martin-Löf random numbers
Mathematics as information compression via the matching and unification of patterns
Randomness, independence, and hypotheses
Algorithmic tests and randomness with respect to a class of measures
Model discrimination using an algorithmic information criterion
One or Many Concepts of Information?
SYSTEMS AND DISTINCTIONS; DUALITY AND COMPLEMENTARITY†
Randomness and Effective Dimension of Continued Fractions.
Mathematical metaphysics of randomness
Ergodic theorems for individual random sequences
A spectrum of compromise aggregation operators for multi-attribute decision making
Inequalities for Shannon entropy and Kolmogorov complexity
Measures of ignorance, information and uncertainty. I
Testing randomness online
On the application of algorithmic information theory to decision problems
A quantitative Occam's razor
On the relation between descriptional complexity and algorithmic probability
Thinking with notations: epistemic actions and epistemic activities in mathematical practice
A study of the fractal character in electronic noise processes
Uncontrollable computational growth in theoretical physics
Links between physics and set theory.




This page was built for publication: Logical basis for information theory and probability theory