On the role of procrastination in machine learning
From MaRDI portal
Cited in (41)
- On a generalized notion of mistake bounds
- Feasible Iteration of Feasible Learning Functionals
- Incremental Learning with Ordinal Bounded Example Memory
- Recursion theoretic models of learning: Some results and intuitions
- Calculating the mind change complexity of learning algebraic structures
- Iterative learning of simple external contextual languages
- Counting extensional differences in BC-learning
- On ordinal VC-dimension and some notions of complexity
- Parsimony hierarchies for inductive inference
- Learning algebraic structures from text
- Uncomputability: The problem of induction internalized
- Dynamically Delayed Postdictive Completeness and Consistency in Learning
- Learning with ordinal-bounded memory from positive data
- Borel-piecewise continuous reducibility for uniformization problems
- Ockham's razor, empirical complexity, and truth-finding efficiency
- Resource restricted computability theoretic learning: Illustrative topics and problems
- Classes with easily learnable subclasses
- Learning correction grammars
- Mind change efficient learning
- On the intrinsic complexity of learning recursive functions
- Learning by switching type of information
- Program size complexity of correction grammars in the Ershov hierarchy
- Inside the Muchnik degrees. I: Discontinuity, learnability and constructivism
- On the data consumption benefits of accepting increased uncertainty
- Robust learning aided by context
- Ordinal mind change complexity of language identification
- Topological properties of concept spaces (full version)
- The logic of reliable and efficient inquiry
- On the classification of recursive languages
- Computable categoricity and the Ershov hierarchy
- Non-U-shaped vacillatory and team learning
- Probabilistic and team PFIN-type learning: General properties
- Mind change complexity of learning logic programs
- Scientific article without title; zbMATH DE number 1416102
- Topological Properties of Concept Spaces
- Mind change complexity of inferring unbounded unions of restricted pattern languages from positive data
- Incremental concept learning for bounded data mining
- Generalized notions of mind change complexity
- Rice and Rice-Shapiro theorems for transfinite correction grammars
- General inductive inference types based on linearly-ordered sets
- Unifying logic, topology and learning in parametric logic
This page was built for publication: On the role of procrastination in machine learning (MaRDI item Q1317427)