On the computational complexity of approximating distributions by probabilistic automata
Publication: 1207305
zbMath: 0766.68106
MaRDI QID: Q1207305
Publication date: 1 April 1993
Published in: Machine Learning
Keywords: computational learning theory; hidden Markov models; Kullback-Leibler divergence; speech recognition; density estimation; PAC learning model
MSC classification: Analysis of algorithms and problem complexity (68Q25); Learning and adaptive systems in artificial intelligence (68T05); Formal languages and automata (68Q45)
Related Items
The consensus string problem and the complexity of comparing hidden Markov models
Learning of Structurally Unambiguous Probabilistic Grammars
Learning deterministic context free grammars: the Omphalos competition
Recent advances of grammatical inference
Grammatical inference: An old and new paradigm
Learning Probability Distributions Generated by Finite-State Machines
Generalization bounds for learning weighted automata
Regular inference as vertex coloring
\textsc{PAutomaC}: a probabilistic automata and hidden Markov models learning competition
A comparison of collapsed Bayesian methods for probabilistic finite automata
PAC-learnability of probabilistic deterministic finite state automata in terms of variation distance
The power of amnesia: Learning probabilistic automata with variable memory length
Learning nonsingular phylogenies and hidden Markov models
A multi-parameter analysis of hard problems on deterministic finite automata
Links between probabilistic automata and hidden Markov models: probability distributions, learning models and induction algorithms
On the learnability and usage of acyclic probabilistic finite automata
On the Rademacher Complexity of Weighted Automata