Decision theoretic generalizations of the PAC model for neural net and other learning applications

From MaRDI portal
Publication:1198550

DOI: 10.1016/0890-5401(92)90010-D
zbMath: 0762.68050
Wikidata: Q114953529
Scholia: Q114953529
MaRDI QID: Q1198550

David Haussler

Publication date: 16 January 1993

Published in: Information and Computation




Related Items

Deep learning: a statistical viewpoint
Learning to Recognize Three-Dimensional Objects
Approximate Degree in Classical and Quantum Computing
Concept learning by example decomposition
Recurrent Neural Networks with Small Weights Implement Definite Memory Machines
Non-parametric regression for spatially dependent data with wavelets
Submodular Functions: Learnability, Structure, and Optimization
On approximately identifying concept classes in the limit
Nonlinear approximation of functions by sets of finite pseudo-dimension in the probabilistic and average case settings
Learning with risks based on M-location
Minimax rates for conditional density estimation via empirical entropy
Neural Networks with Local Receptive Fields and Superlinear VC Dimension
Robust Estimators in High-Dimensions Without the Computational Intractability
Finite sample properties of system identification of ARX models under mixing conditions
A Size-Sensitive Discrepancy Bound for Set Systems of Bounded Primal Shatter Dimension
Improved bounds on the sample complexity of learning
Agnostic learning of geometric patterns
Nonlinear approximations using sets of finite cardinality or finite pseudo-dimension
Randomized algorithms for robust controller synthesis using statistical learning theory
Nonparametric regression function estimation using interaction least squares splines and complexity regularization
Learning automata algorithms for pattern classification
A comparison of identification criteria for inductive inference of recursive real-valued functions
A PAC Approach to Application-Specific Algorithm Selection
Metric Entropy for Functions of Bounded Total Generalized Variation
A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers
Valid Generalisation from Approximate Interpolation
ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
Neural networks and some applications to finance
Learning in Repeated Auctions
Agnostically Learning Boolean Functions with Finite Polynomial Representation
Concentration inequalities, large and moderate deviations for self-normalized empirical processes
Efficient distribution-free learning of probabilistic concepts
Some connections between learning and optimization
Efficient algorithms for learning functions with bounded variation
Amplification of One-Way Information Complexity via Codes and Noise Sensitivity
Toward efficient agnostic learning
Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
A counterexample concerning uniform ergodic theorems for a class of functions
Learning from a population of hypotheses
Non-linear approximation of functions with mixed smoothness by sets of finite pseudo-dimension
Complexity of hyperconcepts
Approximation and learning of convex superpositions
PALO: a probabilistic hill-climbing algorithm
On the complexity of learning from drifting distributions
On the value of partial information for learning from examples
Simulation-based optimization of Markov decision processes: an empirical process theory approach
On convergence proofs in system identification -- a general principle using ideas from learning theory
Orthogonal series estimates on strong spatial mixing data
Relation between weight size and degree of over-fitting in neural network regression
Learning cost-sensitive active classifiers
Optimal adaptive sampling recovery
Nonlinear orthogonal series estimates for random design regression
On the difficulty of approximately maximizing agreements
Bayesian-validated computer-simulation surrogates for optimization and design: Error estimates and applications
Learning half-spaces on general infinite spaces equipped with a distance function
Relative \((p,\varepsilon )\)-approximations in geometry
A Bayesian perspective of statistical machine learning for big data
Bracketing entropy and VC-dimension
A complete characterization of statistical query learning with applications to evolvability
Reliable agnostic learning
Learning with stochastic inputs and adversarial outputs
Least squares estimators of the regression function with twice censored data
Approximation with neural networks activated by ramp sigmoids
Incentive compatible regression learning
Relative deviation learning bounds and generalization with unbounded loss functions
An inequality for uniform deviations of sample averages from their means
Generalization ability of fractional polynomial models
On learning multicategory classification with sample queries
Learning \(AC^0\) Under k-Dependent Distributions
Analysis of two gradient-based algorithms for on-line regression
Deep-Learning Solution to Portfolio Selection with Serially Dependent Returns
On-line learning of smooth functions of a single variable
Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions
Rigorous learning curve bounds from statistical mechanics
Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
Best lower bound on the probability of a binomial exceeding its expectation
Aspects of discrete mathematics and probability in the theory of machine learning
PAC-learning in the presence of one-sided classification noise
On the mathematical foundations of learning
Surrogate losses in passive and active learning
On the Complexity of Computing and Learning with Multiplicative Neural Networks
Analysis of a multi-category classifier
Two proofs for shallow packings
Covering numbers for bounded variation functions
Randomized algorithms for robust controller synthesis using statistical learning theory: a tutorial overview
Learning big (image) data via coresets for dictionaries
Tight bounds on \(\ell_1\) approximation and learning of self-bounding functions
Multi-category classifiers and sample width
Theory of Classification: a Survey of Some Recent Advances
Estimation error analysis of deep learning on the regression problem on the variable exponent Besov space
Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas
A graph-theoretic generalization of the Sauer-Shelah lemma
Scale-sensitive dimensions and skeleton estimates for classification
The degree of approximation of sets in Euclidean space using sets with bounded Vapnik-Chervonenkis dimension
Variational Monte Carlo -- bridging concepts of machine learning and high-dimensional partial differential equations
Prediction, learning, uniform convergence, and scale-sensitive dimensions
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
An introduction to some statistical aspects of PAC learning theory
Learning dynamical systems in a stationary environment
The complexity of model classes, and smoothing noisy data
A learning result for continuous-time recurrent neural networks
Learning Hurdles for Sleeping Experts
Evolvability of Real Functions
Sample complexity of model-based search
Universally consistent regression function estimation using hierarchical \(B\)-splines
Nonparametric estimation of piecewise smooth regression functions
Reconstructing Algebraic Functions from Mixed Data
A better approximation for balls
Exact lower bounds for the agnostic probably-approximately-correct (PAC) machine learning model
Approximation of Sobolev-type classes with quasi-seminorms
Inequalities for uniform deviations of averages from expectations with applications to nonparametric regression
Distribution-free consistency of empirical risk minimization and support vector regression
On weak base hypotheses and their implications for boosting regression and classification
Learning with queries corrupted by classification noise
On the learnability of rich function classes
On the orders of nonlinear approximations for classes of functions of given form
Sharp estimates for the covering numbers of the Weierstrass fractal kernel
On the complexity of learning for spiking neurons with temporal coding
Learning fixed-dimension linear thresholds from fragmented data
Probabilistic \(k\)-median clustering in data streams
Local Rademacher complexities
Prediction from randomly right censored data
Combinatorics and connectionism
A result of Vapnik with applications
Hardness results for neural network approximation problems
A new approach for learning belief networks using independence criteria
Primal and dual combinatorial dimensions
Rates of uniform convergence of empirical means with mixing processes
Maximizing agreements and coagnostic learning



Cites Work