scientific article; zbMATH DE number 1950575
From MaRDI portal (Publication Q4413261)
zbMATH Open: 1019.68093 · MaRDI QID: Q4413261 · FDO: Q4413261
Authors: Shahar Mendelson
Publication date: 17 July 2003
Full work available at URL: http://link.springer.de/link/service/series/0558/bibs/2600/26000001.htm
Title of this publication is not available.
Mathematics Subject Classification:
- Statistical aspects of information-theoretic topics (62B10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Inequalities; stochastic orderings (60E15)
- Signal detection and filtering (aspects of stochastic processes) (60G35)
Cited in (38):
- Learning bounds of ERM principle for sequences of time-dependent samples
- Approximating the covariance ellipsoid
- Integer cells in convex sets
- Title not available
- Some notes on perceptron learning
- The shattering dimension of sets of linear functionals
- Empirical minimization
- Dimension reduction by random hyperplane tessellations
- Consistency of spectral clustering
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- Sequential complexities and uniform martingale laws of large numbers
- Convergence of the weighted nonlocal Laplacian on random point cloud
- Imaging conductivity from current density magnitude using neural networks
- On the mathematical foundations of learning
- Complexity regularization via localized random penalties
- Obtaining fast error rates in nonconvex situations
- On the geometry of polytopes generated by heavy-tailed random vectors
- Monte Carlo algorithms for optimal stopping and statistical learning
- Sampling discretization error of integral norms for function classes
- Foundations of support constraint machines
- Oracle inequalities for support vector machines that are based on random entropy numbers
- An axiomatic approach to intrinsic dimension of a dataset
- Solving PDEs on spheres with physics-informed convolutional neural networks
- A refined Hoeffding's upper tail probability bound for sum of independent random variables
- Robust support vector machines for classification with nonconvex and smooth losses
- Regularized learning schemes in feature Banach spaces
- Generalization error of minimum weighted norm and kernel interpolation
- Sampling discretization and related problems
- A recursive procedure for density estimation on the binary hypercube
- Refined Rademacher chaos complexity bounds with applications to the multikernel learning problem
- Unregularized online learning algorithms with general loss functions
- Title not available
- Kernel methods in machine learning
- Asymptotic properties of neural network sieve estimators
- Estimation in high dimensions: a geometric perspective
- On the optimality of sample-based estimates of the expectation of the empirical minimizer
- Aspects of discrete mathematics and probability in the theory of machine learning
- Theory of Classification: a Survey of Some Recent Advances