Learning performance of Tikhonov regularization algorithm with geometrically beta-mixing observations
Publication: 619769
DOI: 10.1016/j.jspi.2010.07.011
zbMath: 1206.62095
MaRDI QID: Q619769
Rong Chen, Bin Zou, Zong Ben Xu
Publication date: 18 January 2011
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2010.07.011
Cites Work
- Markov chains and stochastic stability
- Regularized least square regression with dependent samples
- Learning from dependent observations
- Rates of convergence for empirical processes of stationary mixing sequences
- Computable bounds for geometric convergence rates of Markov chains
- A note on uniform laws of averages for dependent processes
- Learning and generalisation. With applications to neural networks.
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Rates of uniform convergence of empirical means with mixing processes
- Learning rates of regularized regression for exponentially strongly mixing sequence
- The performance bounds of learning machines based on exponentially strongly mixing sequences
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- Learning Theory
- Support Vector Machines
- Capacity of reproducing kernel spaces in learning theory
- Online learning with Markov sampling
- Minimum complexity regression estimation with weakly dependent observations
- Estimating the approximation error in learning theory
- Probability Inequalities for Sums of Bounded Random Variables
- Learning Theory