On the role of procrastination in machine learning


Publication:1317427

DOI: 10.1006/inco.1993.1068 · zbMath: 0794.68127 · OpenAlex: W1999736502 · MaRDI QID: Q1317427

Carl H. Smith, Rūsiņš Freivalds

Publication date: 17 April 1994

Published in: Information and Computation

Full work available at URL: https://doi.org/10.1006/inco.1993.1068

Related Items (39)

Generalized notions of mind change complexity
Counting extensional differences in BC-learning
On the data consumption benefits of accepting increased uncertainty
Inside the Muchnik degrees. I: Discontinuity, learnability and constructivism
Ockham's razor, empirical complexity, and truth-finding efficiency
Program Size Complexity of Correction Grammars in the Ershov Hierarchy
Parsimony hierarchies for inductive inference
On ordinal VC-dimension and some notions of complexity
Classes with easily learnable subclasses
On the classification of recursive languages
Recursion theoretic models of learning: Some results and intuitions
On the intrinsic complexity of learning recursive functions
Feasible Iteration of Feasible Learning Functionals
Learning with ordinal-bounded memory from positive data
Borel-Piecewise Continuous Reducibility for Uniformization Problems
Topological Properties of Concept Spaces
Dynamically Delayed Postdictive Completeness and Consistency in Learning
Learning by switching type of information
Non-U-shaped vacillatory and team learning
Probabilistic and team PFIN-type learning: General properties
Uncomputability: The problem of induction internalized
Computable categoricity and the Ershov hierarchy
Topological properties of concept spaces (full version)
Mind change complexity of inferring unbounded unions of restricted pattern languages from positive data
Mind change efficient learning
Iterative learning of simple external contextual languages
Learning algebraic structures from text
Resource restricted computability theoretic learning: Illustrative topics and problems
Learning correction grammars
Rice and Rice-Shapiro Theorems for transfinite correction grammars
Robust learning aided by context
Incremental Learning with Ordinal Bounded Example Memory
Ordinal mind change complexity of language identification
The logic of reliable and efficient inquiry
Incremental concept learning for bounded data mining
On a generalized notion of mistake bounds
Calculating the mind change complexity of learning algebraic structures
Mind change complexity of learning logic programs
Unifying logic, topology and learning in parametric logic