Towards a map for incremental learning in the limit from positive and negative information
From MaRDI portal
Publication: 2117790
Cites work
- Scientific article; zbMATH DE number 3539202 (no title available)
- Scientific article; zbMATH DE number 517031 (no title available)
- Scientific article; zbMATH DE number 517033 (no title available)
- Scientific article; zbMATH DE number 680708 (no title available)
- Scientific article; zbMATH DE number 3291134 (no title available)
- A map of update constraints in inductive inference
- Classical recursion theory. Vol. II
- Incremental concept learning for bounded data mining
- Incremental learning from positive data
- Inductive inference of automata, functions and programs
- Inductive inference of formal languages from positive data
- Infinitary self-reference in learning theory
- Language identification in the limit
- Learning indexed families of recursive languages from positive data: A survey
- Learning strategies
- Learning without coding
- Normal forms in semantic language identification
- On the role of update constraints and text-types in iterative learning
- Optimal language learning from positive data
- Parallelism increases iterative learning power
- Periodicity in generations of automata
- Results on memory-limited U-shaped learning
- Some natural conditions on incremental learning
- Strongly non-U-shaped language learning results by general techniques
- Toward a mathematical theory of inductive inference
- Towards an atlas of computational learning theory
- U-shaped, iterative, and iterative-with-counter learning
- When unlearning helps