Parallelism increases iterative learning power
Cites work
- Untitled scientific article (zbMATH DE number 439891)
- Untitled scientific article (zbMATH DE number 3817068)
- Untitled scientific article (zbMATH DE number 3932417)
- Untitled scientific article (zbMATH DE number 67629)
- Untitled scientific article (zbMATH DE number 3539202)
- Untitled scientific article (zbMATH DE number 2077161)
- Untitled scientific article (zbMATH DE number 743586)
- Untitled scientific article (zbMATH DE number 3291134)
- A machine discovery from amino acid sequences by decision trees over regular patterns
- Finding patterns common to a set of strings
- Incremental concept learning for bounded data mining
- Incremental learning from positive data
- Infinitary self-reference in learning theory
- Language identification in the limit
- Language learning from texts: Mindchanges, limited memory and monotonicity
- Learning Left-to-Right and Right-to-Left Iterative Languages
- Parallelism Increases Iterative Learning Power
- Periodicity in generations of automata
- Polynomial-time inference of arbitrary pattern languages
- Predictive learning models for concept drift
- Results on memory-limited U-shaped learning
- U-shaped, iterative, and iterative-with-counter learning
Cited in (5)
- Parallelism Increases Iterative Learning Power
- Iterative learning of simple external contextual languages
- Widening: using parallel resources to improve model quality
- Strongly non-U-shaped language learning results by general techniques
- Towards a map for incremental learning in the limit from positive and negative information
This page was built for publication: Parallelism increases iterative learning power (MaRDI item Q1017664)