Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz



zbMath: 0499.62005
MaRDI QID: Q1171840

Vladimir Vapnik

Publication date: 1982

Published in: Springer Series in Statistics


MSC classification

62H12: Estimation in multivariate analysis

62G05: Nonparametric estimation

62-02: Research exposition (monographs, survey articles) pertaining to statistics

62C12: Empirical decision procedures; empirical Bayes procedures


Related Items

A solution of the filtering and smoothing problems for uncertain-stochastic linear dynamic systems
Stabilized Reconstruction in Signal and Image Processing
Valid Generalisation from Approximate Interpolation
Ten More Years of Error Rate Research
Geometric Representation of High Dimension, Low Sample Size Data
A tutorial on ν‐support vector machines
Finite sample properties of system identification of ARX models under mixing conditions
Non- and semiparametric statistics: compared and contrasted
Some applications of concentration inequalities to statistics
Gaining degrees of freedom in subsymbolic learning
Least third-order cumulant method with adaptive regularization parameter selection for neural networks
Randomized algorithms for robust controller synthesis using statistical learning theory
Probabilistic solutions to some NP-hard matrix problems
Learning automata algorithms for pattern classification.
Complexity of learning in artificial neural networks
Density estimation by the penalized combinatorial method
Monte Carlo algorithms for optimal stopping and statistical learning
Rigorous learning curve bounds from statistical mechanics
Neural networks with quadratic VC dimension
Asymptotic behavior of solutions of some classes of stochastic equations and their applications to statistical problems
Results on learnability and the Vapnik-Chervonenkis dimension
Learning and generalization errors for the 2D binary perceptron.
Optimizing resources in model selection for support vector machine
Risk bounds for statistical learning
Creating a quality map of a slate deposit using support vector machines
Kernel methods in machine learning
Locally linear reconstruction for instance-based learning
Bayesian approach, theory of empirical risk minimization. Comparative analysis
Estimating a density and its derivatives via the minimum distance method
On the asymptotic properties of smoothed estimators of the classification error rate
Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
Learning metric-topological maps for indoor mobile robot navigation
On convergence proofs in system identification -- a general principle using ideas from learning theory
Theoretical aspects of ill-posed problems in statistics
Equivalence of models for polynomial learnability
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Scale-sensitive dimensions and skeleton estimates for classification
The degree of approximation of sets in euclidean space using sets with bounded Vapnik-Chervonenkis dimension
An introduction to some statistical aspects of PAC learning theory
A learning result for continuous-time recurrent neural networks
Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
On the learnability of rich function classes
Combinatorics and connectionism
A result of Vapnik with applications
Efficient distribution-free learning of probabilistic concepts
Three fundamental concepts of the capacity of learning machines
A review of combinatorial problems arising in feedforward neural network design
An investigation on the conditions of pruning an induced decision tree
Vapnik-Chervonenkis dimension and (pseudo-)hyperplane arrangements
Identification of nonlinear block-oriented systems by the recursive kernel estimate
Toward efficient agnostic learning
A theory for memory-based learning
Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
Search for the best decision rules with the help of a probabilistic estimate
A decision-theoretic generalization of on-line learning and an application to boosting
Approximation and learning of convex superpositions
On the value of partial information for learning from examples
PAC learning of concept classes through the boundaries of their items
Nonlinear orthogonal series estimates for random design regression
Constructing fixed rank optimal estimators with method of best recurrent approximations
A note on different covering numbers in learning theory.
Results in statistical discriminant analysis: A review of the former Soviet Union literature.
On learning multicategory classification with sample queries.
Formal methods in pattern recognition: A review
Inequalities for uniform deviations of averages from expectations with applications to nonparametric regression
A geometric approach to leveraging weak learners
The complexity of theory revision
A unified treatment of direct and indirect estimation of a probability density and its derivatives
Optimal recursive estimation of raw data
Arcing classifiers. (With discussion)
A general lower bound on the number of examples needed for learning
The Vapnik-Chervonenkis dimension of a random graph
Improved lower bounds for learning from noisy examples: An information-theoretic approach
Optimal aggregation of classifiers in statistical learning.
On data classification by iterative linear partitioning
Some connections between learning and optimization
A counterexample concerning uniform ergodic theorems for a class of functions
Fusion methods for multiple sensor systems with unknown error densities
A generalization of Sauer's lemma
Learning from a population of hypotheses
An approach to guided learning of Boolean functions
Accuracy of techniques for the logical analysis of data
An inequality for uniform deviations of sample averages from their means
\(P\)-sufficient statistics for PAC learning \(k\)-term-DNF formulas through enumeration
Target differentiation with simple infrared sensors using statistical pattern recognition techniques
Model selection by bootstrap penalization for classification
Guest editorial: Learning theory
Quadratic boosting
A general soft method for learning SVM classifiers with \(L_{1}\)-norm penalty
Aspects of discrete mathematics and probability in the theory of machine learning
Computational intelligence in earth sciences and environmental applications: issues and challenges.
New algorithms for singly linearly constrained quadratic programs subject to lower and upper bounds
Universal consistency of delta estimators
Adaptive regression estimation with multilayer feedforward neural networks
Theory of Classification: a Survey of Some Recent Advances
SHOCK PHYSICS DATA RECONSTRUCTION USING SUPPORT VECTOR REGRESSION
Analysis to Neyman-Pearson classification with convex loss function
Well-posed linear models for large-scale zero-memory composite systems