Learning languages in the limit from positive information with finitely many memory changes
From MaRDI portal
Publication:2117794
Recommendations
- Language learning from texts: Mindchanges, limited memory and monotonicity
- Iterative Learning from Positive Data and Negative Counterexamples
- Memory-limited non-U-shaped learning with solved open problems
- Iterative learning from positive data and negative counterexamples
- scientific article; zbMATH DE number 512873
Cites work
- scientific article; zbMATH DE number 517031 (title unavailable)
- scientific article; zbMATH DE number 517033 (title unavailable)
- scientific article; zbMATH DE number 680708 (title unavailable)
- A map of update constraints in inductive inference
- Classical recursion theory. Vol. II
- Incremental concept learning for bounded data mining
- Incremental learning from positive data
- Inductive inference of formal languages from positive data
- Infinitary self-reference in learning theory
- Language identification in the limit
- Learning strategies
- Learning without coding
- Memory-limited non-U-shaped learning with solved open problems
- Normal forms in semantic language identification
- On the role of update constraints and text-types in iterative learning
- Optimal language learning from positive data
- Periodicity in generations of automata
- Results on memory-limited U-shaped learning
- Some natural conditions on incremental learning
- Strongly non-U-shaped language learning results by general techniques
- Toward a mathematical theory of inductive inference
- Towards an atlas of computational learning theory
- U-Shaped, Iterative, and Iterative-with-Counter Learning
- When unlearning helps