An oracle inequality for regularized risk minimizers with strongly mixing observations
From MaRDI portal
Publication: Q373424
DOI: 10.1007/s11464-013-0247-4
zbMATH: 1296.62138
OpenAlex: W2002831319
MaRDI QID: Q373424
Publication date: 22 October 2013
Published in: Frontiers of Mathematics in China
Full work available at URL: https://doi.org/10.1007/s11464-013-0247-4
Mathematics Subject Classification:
- Computational learning theory (68Q32)
- Inequalities; stochastic orderings (60E15)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Learning from non-identical sampling for classification
- Learning from dependent observations
- Fast rates for support vector machines using Gaussian kernels
- Rates of convergence for empirical processes of stationary mixing sequences
- Nonparametric time series prediction through adaptive model selection
- Rates of uniform convergence of empirical means with mixing processes
- New approaches to statistical learning theory
- On the mathematical foundations of learning
- Indefinite kernel network with dependent sampling
- Online regression with varying Gaussians and non-identical distributions
- A central limit theorem and a strong mixing condition
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Online learning with Markov sampling
- Minimum complexity regression estimation with weakly dependent observations
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Theory of Reproducing Kernels