scientific article

From MaRDI portal

Publication:4074808

zbMath: 0314.94019
MaRDI QID: Q4074808

Peter Gács

Publication date: 1974


Title: not shown (zbMATH Open Web Interface contents unavailable due to conflicting licenses).




Related Items (48)

Kolmogorov complexity arguments in combinatorics
Randomness and reducibility
Computational depth and reducibility
A Note on Blum Static Complexity Measures
The Normalized Algorithmic Information Distance Can Not Be Approximated
The sum \(2^{KM(x)-K(x)}\) over all prefixes \(x\) of some binary sequence can be infinite
Relating and contrasting plain and prefix Kolmogorov complexity
Computational depth and reducibility
Characterising the Martin-Löf random sequences using computably enumerable sets of measure one
Algorithmic arguments in physics of computation
On universal prediction and Bayesian confirmation
An almost machine-independent theory of program-length complexity, sophistication, and induction
On generalized computable universal priors and their convergence
Dimension spectra of lines
The discovery of algorithmic probability
Statistical properties of finite sequences with high Kolmogorov complexity
Conditional Kolmogorov complexity and universal probability
On continued fraction randomness and normality
Fixed point theorems on partial randomness
Martingales in the Study of Randomness
CHAITIN’S Ω AS A CONTINUOUS FUNCTION
A generalized characterization of algorithmic probability
Randomness as an invariant for number representations
Preface: Taming randomness and complexity -- essays in honour of Professor Péter Gács
An additivity theorem for plain Kolmogorov complexity
Milking the Aanderaa argument
Relations between varieties of Kolmogorov complexities
Bounding the dimension of points on a line
Proofs of conservation inequalities for Levin's notion of mutual information of 1974
Algorithmic randomness of continuous functions
Entropy measures vs. Kolmogorov complexity
Algorithmic information theory and its statistical mechanical interpretation
Prefix and plain Kolmogorov complexity characterizations of 2-randomness: simple proofs
Inductive reasoning and Kolmogorov complexity
Short lists with short programs in short time
Average case complexity under the universal distribution equals worst-case complexity
Random Continuous Functions
Program size complexity for possibly infinite computations
Symmetry of information and one-way functions
On universal transfer learning
Randomness and Effective Dimension of Continued Fractions
Computable Bayesian Compression for Uniformly Discretizable Statistical Models
Uniform test of algorithmic randomness over a general space
Kolmogorov Complexity in Perspective Part I: Information Theory and Randomness
On the relation between descriptional complexity and algorithmic probability
Entropic measures, Markov information sources and complexity
Sequential predictions based on algorithmic complexity
Thinking with notations: epistemic actions and epistemic activities in mathematical practice






