The Power of Vacillation in Language Learning
Publication: 4268851
DOI: 10.1137/S0097539793249694
zbMATH Open: 0939.68099
OpenAlex: W2002485318
MaRDI QID: Q4268851
Publication date: 28 October 1999
Published in: SIAM Journal on Computing
Full work available at URL: https://doi.org/10.1137/s0097539793249694
Classifications: Learning and adaptive systems in artificial intelligence (68T05); Formal languages and automata (68Q45)
Cited In (44)
- On the role of update constraints and text-types in iterative learning
- Numberings optimal for learning
- The synthesis of language learners
- Variations on U-shaped learning
- Intrinsic complexity of learning geometrical concepts from positive data
- Hypothesis spaces for learning
- When unlearning helps
- U-shaped, iterative, and iterative-with-counter learning
- Learnability and positive equivalence relations
- Computability-theoretic learning complexity
- Learning in Friedberg numberings
- Prescribed learning of r.e. classes
- Infinitary self-reference in learning theory
- Effectivity questions for Kleene's recursion theorem
- Learning correction grammars
- Resource restricted computability theoretic learning: Illustrative topics and problems
- Learning by switching type of information
- Uncountable automatic classes and learning
- Uncountable Automatic Classes and Learning
- Secretive interaction. Players and strategies
- Prescribed Learning of R.E. Classes
- Effectivity Questions for Kleene’s Recursion Theorem
- Results on memory-limited U-shaped learning
- Gold-Style Learning Theory
- Learning theory in the arithmetic hierarchy. II.
- Machine induction without revolutionary paradigm shifts
- Learnability: Admissible, co-finite, and hypersimple languages
- Strongly non-U-shaped language learning results by general techniques
- Learning in Friedberg Numberings
- Synthesizing learners tolerating computable noisy data
- Hypothesis Spaces for Learning
- Topological separations in inductive inference
- Vacillatory and BC learning on noisy data
- Non-U-shaped vacillatory and team learning
- Repeating and remembering foreign language words: Implications for language teaching systems
- Prudence in vacillatory language identification
- Learnability of automatic classes
- Numberings Optimal for Learning
- Maximal machine learnable classes
- On aggregating teams of learning machines
- Learning in the presence of inaccurate information
- Incremental concept learning for bounded data mining
- On the non-existence of maximal inference degrees for language identification