A formal theory of inductive inference. Part I
Publication: 5674429
DOI: 10.1016/S0019-9958(64)90223-2
zbMath: 0258.68045
OpenAlex: W4213350211
Wikidata: Q54266495 · Scholia: Q54266495
MaRDI QID: Q5674429
Publication date: 1964
Published in: Information and Control
Full work available at URL: https://doi.org/10.1016/s0019-9958(64)90223-2
Related Items
Kolmogorov complexity based upper bounds for the unsatisfiability threshold of random k-SAT ⋮ A Note on Blum Static Complexity Measures ⋮ Computational depth and reducibility ⋮ The Whole and the Parts: The Minimum Description Length Principle and the A-Contrario Framework ⋮ Reflective Oracles: A Foundation for Game Theory in Artificial Intelligence ⋮ A test for randomness based on a complexity measure ⋮ On the Influence of Technology on Learning Processes ⋮ DEGREES OF RANDOMIZED COMPUTABILITY ⋮ Gacs quantum algorithmic entropy in infinite dimensional Hilbert spaces ⋮ Introduction: computability of the physical ⋮ Algorithmic thermodynamics ⋮ Schnorr randomness ⋮ Algorithmic complexity of points in dynamical systems ⋮ Characterization of language learning from informant under various monotonicity constraints ⋮ Ignoring data may be the only way to learn efficiently ⋮ Generalized kolmogorov complexity and other dual complexity measures ⋮ Competitive On-line Statistics ⋮ Recursive computational depth ⋮ HOW DIFFICULT IS IT TO INVENT A NONTRIVIAL GAME? ⋮ Universality probability of a prefix-free machine ⋮ A unified approach to the definition of random sequences ⋮ PROBLEMS WITH COMPLEXITY IN GOLD'S PARADIGM OF INDUCTION Part I: Dynamic Complexity ⋮ PROBLEMS WITH COMPLEXITY IN GOLD'S PARADIGM OF INDUCTION Part II: Static Complexity ⋮ Constructive reinforcement learning ⋮ A NOTE ON THE LEARNING-THEORETIC CHARACTERIZATIONS OF RANDOMNESS AND CONVERGENCE ⋮ Alien coding ⋮ Kolmogorov and mathematical logic ⋮ A circuit complexity formulation of algorithmic information theory ⋮ Martingales in the Study of Randomness ⋮ The Kolmogorov birthday paradox ⋮ On initial segment complexity and degrees of randomness ⋮ One-way functions and the hardness of (probabilistic) time-bounded Kolmogorov complexity w.r.t. samplable distributions ⋮ HIGHER RANDOMNESS AND GENERICITY ⋮ Kolmogorov's Last Discovery? (Kolmogorov and Algorithmic Statistics) ⋮ Comparing descriptional and computational complexity of infinite words ⋮ Algorithmic Statistics: Forty Years Later ⋮ Relations between varieties of kolmogorov complexities ⋮ Face Representations via Tensorfaces of Various Complexities ⋮ Foundations of Support Constraint Machines ⋮ Optimal enumerations and optimal gödel numberings ⋮ The Quest for Uncertainty ⋮ An incompressibility theorem for automatic complexity ⋮ Macrodynamic Cooperative Complexity of Information Dynamics ⋮ Quantum Algorithmic Complexities and Entropy ⋮ On the computability of a construction of Brownian motion ⋮ SYSTEM IDENTIFICATION, APPROXIMATION AND COMPLEXITY ⋮ THE FASTEST AND SHORTEST ALGORITHM FOR ALL WELL-DEFINED PROBLEMS ⋮ HIERARCHIES OF GENERALIZED KOLMOGOROV COMPLEXITIES AND NONENUMERABLE UNIVERSAL MEASURES COMPUTABLE IN THE LIMIT ⋮ Schnorr Randomness ⋮ Trivial Reals ⋮ Estimating Entropy Rates with Bayesian Confidence Intervals ⋮ On Martin-Löf Convergence of Solomonoff’s Mixture ⋮ A new method for sparsity control in support vector classification and regression ⋮ Recursively enumerable reals and Chaitin \(\Omega\) numbers ⋮ Stochastic complexity and the mdl principle ⋮ New error bounds for Solomonoff prediction ⋮ Relations between information criteria for model-structure selection Part 2. Modelling by shortest data description
Learners based on transducers ⋮ Descriptive complexity of computable sequences ⋮ Artificial sequences and complexity measures ⋮ On the Kolmogorov Complexity of Continuous Real Functions ⋮ Kolmogorov Complexity in Perspective Part I: Information Theory and Randomness ⋮ Quantitative limits on the ability of a Maxwell demon to extract work from heat ⋮ A geometric approach to complexity ⋮ Statistical learning theory, model identification and system information content ⋮ The universal path integral ⋮ On the possibility of basing cryptography on \(\mathsf{EXP}\ne \mathsf{BPP} \) ⋮ Complexity analysis to explore the structure of ancient stromatolites ⋮ Randomness and reducibility ⋮ Generation of symmetric exponential sums ⋮ Relating and contrasting plain and prefix Kolmogorov complexity ⋮ Towards a new theory of confirmation ⋮ The dimensions of individual strings and sequences ⋮ On the problem of stable image restoration ⋮ Chaos dynamics executes inductive inference ⋮ Occam bound on lowest complexity of elements ⋮ Sophistication revisited ⋮ Prediction of infinite words with automata ⋮ Large alphabets and incompressibility ⋮ Simultaneous predictive Gaussian classifiers ⋮ On semimeasures predicting Martin-Löf random sequences ⋮ The representation and manipulation of the algorithmic probability measure for problem solving. ⋮ The Kolmogorov complexity of infinite words ⋮ An almost machine-independent theory of program-length complexity, sophistication, and induction ⋮ Streaming generalized cross entropy ⋮ Tape versus queue and stacks: The lower bounds ⋮ Algorithmic complexity bounds on future prediction errors ⋮ The discovery of algorithmic probability ⋮ Stochastic complexity in learning ⋮ Entropy and algorithmic complexity in quantum information theory ⋮ A philosophical treatise of universal induction ⋮ On measuring the complexity of networks: Kolmogorov complexity versus entropy ⋮ Disentangling complexity from randomness and chaos ⋮ A catalog of Boolean concepts. ⋮ Towards a theory of chance - Part II ⋮ Predicting non-stationary processes ⋮ The generalized universal law of generalization. ⋮ Information theory: A multifaceted model of information ⋮ One-way functions using algorithmic and classical information theories ⋮ Complexity of algorithms and computations ⋮ Elementary differences between the degrees of unsolvability and degrees of compressibility ⋮ Real patterns and indispensability ⋮ On the computability of Solomonoff induction and AIXI ⋮ An information-theoretic approach to time bounds for on-line computation ⋮ Some theorems on the algorithmic approach to probability theory and information theory (1971 dissertation directed by A. N. Kolmogorov) ⋮ Theory construction in psychology: The interpretation and integration of psychological data ⋮ Absolutely no free lunches! ⋮ Minimum message length encoding and the comparison of macromolecules ⋮ Low-depth witnesses are easy to find ⋮ Constraints placed on random sequences by their compressibility ⋮ Simplicity and likelihood: an axiomatic approach ⋮ An extended coding theorem with application to quantum complexities ⋮ Dynamics of inductive inference in a unified framework ⋮ Learning recursive functions: A survey ⋮ Absolute versus probabilistic classification in a logical setting ⋮ On the number of infinite sequences with trivial initial segment complexity ⋮ Analogical proportions: from equality to inequality ⋮ Entropy measures vs. Kolmogorov complexity
Algorithmic relative complexity ⋮ Entropy and quantum Kolmogorov complexity: a quantum Brudno's theorem ⋮ Why frequentists and Bayesians need each other ⋮ Model selection based on minimum description length ⋮ \(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration ⋮ Prefix and plain Kolmogorov complexity characterizations of 2-randomness: simple proofs ⋮ Universal forecasting algorithms ⋮ Algorithmic information and simplicity in statistical physics ⋮ Compressibility, laws of nature, initial conditions and complexity ⋮ New theory about old evidence. A framework for open-minded Bayesianism ⋮ Sophistication vs logical depth ⋮ On empirical meaning of randomness with respect to parametric families of probability distributions ⋮ Some non-conventional ideas about algorithmic complexity ⋮ Evolutionary induction of stochastic context free grammars ⋮ Infotropism as the underlying principle of perceptual organization ⋮ Process complexity and effective random tests ⋮ Information measures for infinite sequences ⋮ Does the polynomial hierarchy collapse if onto functions are invertible? ⋮ On the computational power of random strings ⋮ Predictability, Complexity, and Learning ⋮ Microscopic reversibility and macroscopic irreversibility: from the viewpoint of algorithmic randomness ⋮ On the inference of optimal descriptions ⋮ Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence. By D.L. Dowe, Berlin: Springer. 2013. 445 pp. £62.00 (softcover). ISBN 978-3-642-44957-4 ⋮ Model discrimination using an algorithmic information criterion ⋮ Hydrozip: how hydrological knowledge can be used to improve compression of hydrological data ⋮ Have I seen you before? Principles of Bayesian predictive classification revisited ⋮ On calibration error of randomized forecasting algorithms ⋮ A theory of incremental compression ⋮ Theory of chaos and its application to the crisis of debts and the origin of inflation ⋮ Solomonoff Induction Violates Nicod’s Criterion ⋮ On the Computability of Solomonoff Induction and Knowledge-Seeking ⋮ The teaching size: computable teachers and learners for universal languages ⋮ Certain aspects of graphic regularity ⋮ On the application of algorithmic information theory to decision problems ⋮ Is The theory of everything merely the ultimate ensemble theory? ⋮ Uniform test of algorithmic randomness over a general space ⋮ Algorithmic information dynamics of cellular automata ⋮ Applying MDL to learn best model granularity ⋮ On the relation between descriptional complexity and algorithmic probability ⋮ Sequential predictions based on algorithmic complexity ⋮ Thinking with notations: epistemic actions and epistemic activities in mathematical practice ⋮ On the syntactic structure of protein sequences and the concept of grammar complexity ⋮ On Martin-Löf (non-)convergence of Solomonoff's universal mixture ⋮ PAC-learning gains of Turing machines over circuits and neural networks ⋮ A comparison of two approaches to pseudorandomness ⋮ Cryptography and algorithmic randomness ⋮ Similarity, kernels, and the fundamental constraints on cognition