Probabilistic learnability of context-free grammars with basic distributional properties from positive examples
From MaRDI portal
Publication: 5964066
DOI: 10.1016/j.tcs.2015.10.037 · zbMath: 1335.68112 · OpenAlex: W2219332833 · MaRDI QID: Q5964066
Ryo Yoshinaka, Chihiro Shibata
Publication date: 26 February 2016
Published in: Theoretical Computer Science
Full work available at URL: https://doi.org/10.1016/j.tcs.2015.10.037
Computational learning theory (68Q32); Formal languages and automata (68Q45); Grammars and rewriting systems (68Q42)
Cites Work
- Distributional learning of parallel multiple context-free grammars
- The equivalence and inclusion problems for NTS languages
- NTS languages are deterministic and congruential
- Learning regular sets from queries and counterexamples
- Efficient learning of multiple context-free languages with multidimensional substitutability from positive data
- PAC Learning of Some Subclasses of Context-Free Grammars with Basic Distributional Properties from Positive Data
- Integration of the Dual Approaches in the Distributional Learning of Context-Free Grammars
- A Learnable Representation for Syntax Using Residuated Lattices
- Polynomial Time Probabilistic Learning of a Subclass of Linear Languages with Queries
- Identification in the Limit of k,l-Substitutable Context-Free Languages
- PAC-Learning Unambiguous NTS Languages
- Distributional Learning of Some Context-Free Languages with a Minimally Adequate Teacher
- Learning Context Free Grammars with the Syntactic Concept Lattice
- PAC-Learning Unambiguous k,l-NTS ≤ Languages
- A theory of the learnable
- Cryptographic limitations on learning Boolean formulae and finite automata
- Towards Dual Approaches for Learning Context-Free Grammars Based on Syntactic Concept Lattices
- Probability and Computing
- Grammatical Inference: Algorithms and Applications