Measuring the Capacity of Sets of Functions in the Analysis of ERM
From MaRDI portal
Publication: 2805728
DOI: 10.1007/978-3-319-21852-6_16
zbMath: 1336.68223
OpenAlex: W2289593790
MaRDI QID: Q2805728
Publication date: 13 May 2016
Published in: Measures of Complexity
Full work available at URL: https://doi.org/10.1007/978-3-319-21852-6_16
Cites Work
- Estimating conditional quantiles with the help of the pinball loss
- Deviation inequalities for sums of weakly dependent time series
- Optimal concentration inequalities for dynamical systems
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Learning from dependent observations
- Uniform convergence of Vapnik-Chervonenkis classes under ergodic sampling
- Fast rates for support vector machines using Gaussian kernels
- Consistency of support vector machines for forecasting the evolution of an unknown ergodic dynamical system from observations with unknown noise
- Gelfand numbers of operators with values in a Hilbert space
- Consistent nonparametric regression. Discussion
- Rates of convergence for empirical processes of stationary mixing sequences
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- Entropy and the combinatorial dimension
- A Bennett concentration inequality and its application to suprema of empirical processes
- New dependence coefficients. Examples and applications to statistics
- A distribution-free theory of nonparametric regression
- Optimal aggregation of classifiers in statistical learning.
- Weak convergence and empirical processes. With applications to statistics
- The generalization performance of ERM algorithm with strongly mixing observations
- Statistical performance of support vector machines
- Fast learning from \(\alpha\)-mixing observations
- The performance bounds of learning machines based on exponentially strongly mixing sequences
- Theory of Classification: a Survey of Some Recent Advances
- Exponential inequalities and estimation of conditional probabilities
- Support Vector Machines
- Any Discrimination Rule Can Have an Arbitrarily Bad Probability of Error for Finite Sample Size
- Uniform Central Limit Theorems
- Minimum complexity regression estimation with weakly dependent observations
- Scale-sensitive dimensions, uniform convergence, and learnability
- Improving the sample complexity using global data
- Neural Network Learning
- Convexity, Classification, and Risk Bounds
- Convergence of stochastic processes
- Combinatorial methods in density estimation
- New concentration inequalities in product spaces