A formal theory of inductive inference. Part II


DOI: 10.1016/S0019-9958(64)90131-7
zbMath: 0259.68038
Wikidata: Q29042423
Scholia: Q29042423
MaRDI QID: Q5674994

Ray J. Solomonoff

Publication date: 1964

Published in: Information and Control




Related Items

Martingales in the Study of Randomness, Kolmogorov's Last Discovery? (Kolmogorov and Algorithmic Statistics), A LEARNING-THEORETIC CHARACTERISATION OF MARTIN-LÖF RANDOMNESS AND SCHNORR RANDOMNESS, The universal path integral, Predictive stochastic complexity and model estimation for finite-state processes, Complexity analysis to explore the structure of ancient stromatolites, Randomness and reducibility, Computational depth and reducibility, A Note on Blum Static Complexity Measures, The structural complexity of DNA templates -- implications on cellular complexity, The Whole and the Parts: The Minimum Description Length Principle and the A-Contrario Framework, UNIVERSAL CODING AND PREDICTION ON ERGODIC RANDOM POINTS, The Kolmogorov complexity of random reals, On Hausdorff and topological dimensions of the Kolmogorov complexity of the real line, On the Influence of Technology on Learning Processes, On universal prediction and Bayesian confirmation, `Ideal learning' of natural language: positive results about learning from positive evidence, Induction: a logical analysis, Individual communication complexity, DEGREES OF RANDOMIZED COMPUTABILITY, Every 2-random real is Kolmogorov random, What is quantum information?, Algorithmic complexity of quantum capacity, On generalized computable universal priors and their convergence, Dimension spectra of lines, The discovery of algorithmic probability, Open problems in universal induction & intelligence, Obituary: Ray Solomonoff, founding father of algorithmic information theory, A complete theory of everything (will be subjective), A network of autoregressive processing units for time series modeling, The evolution of human communication and the information revolution --- A mathematical perspective, Chaitin's omega and an algorithmic phase transition, Information dissipation in quantum-chaotic systems: Computational view and measurement induction, PAC learning of concept classes through the boundaries of their items, A NOTE ON THE LEARNING-THEORETIC CHARACTERIZATIONS OF RANDOMNESS AND CONVERGENCE, DECISION TREES DO NOT GENERALIZE TO NEW VARIATIONS, Strict process machine complexity, A computable measure of algorithmic probability by finite approximations with an application to integer sequences, A generalized characterization of algorithmic probability, Research in the theory of inductive inference by GDR mathematicians - A survey, Real patterns and indispensability, Deflating the deflationary view of information, Algorithmic Statistics: Forty Years Later, Analogies and theories: the role of simplicity and the emergence of norms, Information-geometric approach to inferring causal directions, Learning recursive functions: A survey, Algorithmic complexity of recursive and inductive algorithms, Random languages for nonuniform complexity classes, Algorithmic information theory and its statistical mechanical interpretation, Inductive logic programming, The Quest for Uncertainty, Monotonic and dual monotonic language learning, Fractal dimension versus process complexity, An incompressibility theorem for automatic complexity, Almost everywhere high nonuniform complexity, Inductive reasoning and Kolmogorov complexity, An Information-Geometric Approach to Learning Bayesian Network Topologies from Data, Identification of probabilities, What is Shannon information?, Quantum Algorithmic Complexities and Entropy, On the computability of a construction of Brownian motion, Average case complexity under the universal distribution equals worst-case complexity, Justifying Additive Noise Model-Based Causal Discovery via Algorithmic Information Theory, Bicompletions of Distance Matrices, Circuit size relative to pseudorandom oracles, Explanatory and creative alternatives to the MDL principle, A mathematical theory of learning transformational grammar, Process and truth-table characterisations of randomness, Predictability: a way to characterize complexity, A theory of information structure I. General principles, Correlation of automorphism group size and topological properties with program-size complexity evaluations of graphs and complex networks, Learners based on transducers, Mathematics as information compression via the matching and unification of patterns, Inference for regular bilanguages, Microscopic reversibility and macroscopic irreversibility: from the viewpoint of algorithmic randomness, Descriptive complexity of computable sequences revisited, Optimal asymptotic bounds on the oracle use in computations from Chaitin's Omega, Sequential fuzzy system identification, Symmetry of information and one-way functions, Artificial sequences and complexity measures, Putnam's diagonal argument and the impossibility of a universal learning machine, A theory of incremental compression, On the Kolmogorov Complexity of Continuous Real Functions, The calculi of emergence: Computation, dynamics and induction, Enumerations of the Kolmogorov function, On the inference of Turing machines from sample computations, Degrees of monotone complexity, Symbolic dynamics of one-dimensional maps: Entropies, finite precision, and noise, Kolmogorov Complexity in Perspective Part I: Information Theory and Randomness, Information and complexity, or: where is the information?, Quantitative limits on the ability of a Maxwell demon to extract work from heat, Observations on Computability, Uncertainty, and Technology, Recursive computational depth., Thinking with notations: epistemic actions and epistemic activities in mathematical practice, An upward measure separation theorem, PAC-learning gains of Turing machines over circuits and neural networks, Convergence rates for the minimum complexity estimator of counting process intensities, Algorithmic analysis of irrational rotations in a single neuron model, Predictions and algorithmic statistics for infinite sequences, Data compression and learning in time sequences analysis