Learning languages in the limit from positive information with finitely many memory changes
Publication: 2117794
DOI: 10.1007/978-3-030-80049-9_29
OpenAlex: W3186664975
MaRDI QID: Q2117794
Publication date: 22 March 2022
Full work available at URL: https://arxiv.org/abs/2010.04782
Keywords: (strongly) non-U-shaped learning; map for bounded memory states learners; memory restricted learning algorithms
Cites Work
- On the role of update constraints and text-types in iterative learning
- A map of update constraints in inductive inference
- Strongly non-U-shaped language learning results by general techniques
- Optimal language learning from positive data
- When unlearning helps
- Classical recursion theory. Vol. II
- Incremental learning from positive data
- Incremental concept learning for bounded data mining
- Memory-limited non-U-shaped learning with solved open problems
- Learning without coding
- Results on memory-limited U-shaped learning
- Some natural conditions on incremental learning
- Learning strategies
- Inductive inference of formal languages from positive data
- Periodicity in generations of automata
- Toward a mathematical theory of inductive inference
- Infinitary self-reference in learning theory
- U-Shaped, Iterative, and Iterative-with-Counter Learning
- Language identification in the limit