On the role of procrastination in machine learning
From MaRDI portal
Publication: 1317427
DOI: 10.1006/inco.1993.1068 · zbMath: 0794.68127 · OpenAlex: W1999736502 · MaRDI QID: Q1317427
Carl H. Smith, Rūsiņš Freivalds
Publication date: 17 April 1994
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1006/inco.1993.1068
Related Items (39)
Generalized notions of mind change complexity
Counting extensional differences in BC-learning
On the data consumption benefits of accepting increased uncertainty
Inside the Muchnik degrees. I: Discontinuity, learnability and constructivism
Ockham's razor, empirical complexity, and truth-finding efficiency
Program Size Complexity of Correction Grammars in the Ershov Hierarchy
Parsimony hierarchies for inductive inference
On ordinal VC-dimension and some notions of complexity
Classes with easily learnable subclasses
On the classification of recursive languages
Recursion theoretic models of learning: Some results and intuitions
On the intrinsic complexity of learning recursive functions
Feasible Iteration of Feasible Learning Functionals
Learning with ordinal-bounded memory from positive data
Borel-Piecewise Continuous Reducibility for Uniformization Problems
Topological Properties of Concept Spaces
Dynamically Delayed Postdictive Completeness and Consistency in Learning
Learning by switching type of information.
Non-U-shaped vacillatory and team learning
Probabilistic and team PFIN-type learning: General properties
Uncomputability: The problem of induction internalized
Computable categoricity and the Ershov hierarchy
Topological properties of concept spaces (full version)
Mind change complexity of inferring unbounded unions of restricted pattern languages from positive data
Mind change efficient learning
Iterative learning of simple external contextual languages
Learning algebraic structures from text
Resource restricted computability theoretic learning: Illustrative topics and problems
Learning correction grammars
Rice and Rice-Shapiro Theorems for transfinite correction grammars
Robust learning aided by context
Incremental Learning with Ordinal Bounded Example Memory
Ordinal mind change complexity of language identification
The logic of reliable and efficient inquiry
Incremental concept learning for bounded data mining.
On a generalized notion of mistake bounds
Calculating the mind change complexity of learning algebraic structures
Mind change complexity of learning logic programs
Unifying logic, topology and learning in parametric logic