Parallelism increases iterative learning power
DOI: 10.1016/j.tcs.2009.01.015
zbMath: 1167.68024
OpenAlex: W2175108584
MaRDI QID: Q1017664
John Case, Samuel E. Moelius III
Publication date: 12 May 2009
Published in: Theoretical Computer Science
Full work available at URL: https://doi.org/10.1016/j.tcs.2009.01.015
Keywords: computational learning theory; parallelism; inductive inference; language learning; iterative learning; parallel learning; Gold-style learning; memory-limited learning
Related Items (3)
- Strongly non-U-shaped language learning results by general techniques
- Iterative learning of simple external contextual languages
- Towards a map for incremental learning in the limit from positive and negative information
Cites Work
- A machine discovery from amino acid sequences by decision trees over regular patterns
- Polynomial-time inference of arbitrary pattern languages
- U-shaped, iterative, and iterative-with-counter learning
- Finding patterns common to a set of strings
- Incremental learning from positive data
- Incremental concept learning for bounded data mining
- Language learning from texts: Mindchanges, limited memory and monotonicity
- Results on memory-limited U-shaped learning
- Parallelism Increases Iterative Learning Power
- Learning Left-to-Right and Right-to-Left Iterative Languages
- Periodicity in generations of automata
- Infinitary self-reference in learning theory
- Language identification in the limit
- Predictive learning models for concept drift