Least-square regularized regression with non-iid sampling
DOI: 10.1016/j.jspi.2009.04.007 · zbMATH Open: 1176.68163 · OpenAlex: W2042138716 · MaRDI QID: Q2272113 · FDO: Q2272113
Authors: Zhi-Wei Pan, Quan-Wu Xiao
Publication date: 5 August 2009
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2009.04.007
Recommendations
- Coefficient regularized regression with non-iid sampling
- Regularized least square regression with dependent samples
- Regularized least square regression with unbounded and dependent sampling
- Regression learning with non-identically and non-independently sampling
- Consistent least squares nonparametric regression
- Least square regression with \(l^{p}\)-coefficient regularization
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- Least-square estimation for regression on random designs for absolutely regular observations
- Nonparametric regression estimation using penalized least squares
Keywords: reproducing kernel Hilbert space; strong mixing condition; least-square regularized regression; sampling with non-identical distributions
MSC: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
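The scheme studied in this paper is Tikhonov-regularized least squares in a reproducing kernel Hilbert space \(\mathcal{H}_K\): given samples \(\{(x_i, y_i)\}_{i=1}^m\) (not assumed i.i.d.), it outputs \(f_{z,\lambda} = \arg\min_{f \in \mathcal{H}_K} \frac{1}{m}\sum_{i=1}^m (f(x_i) - y_i)^2 + \lambda \|f\|_K^2\). By the representer theorem this reduces to an \(m \times m\) linear system. The following is a minimal illustrative sketch (not the authors' code); the Gaussian kernel, the AR(1) sampling process, and the parameter values `sigma` and `lam` are all assumptions chosen for the demonstration.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, lam=0.1, sigma=1.0):
    # Representer-theorem form of regularized least squares:
    # solve (K + lam * m * I) alpha = y, so f(x) = sum_i alpha_i K(x, x_i).
    m = len(y)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy non-i.i.d. design: an AR(1) chain, a standard example of a
# (strongly mixing) dependent sampling process.
rng = np.random.default_rng(0)
x = np.zeros(200)
for t in range(1, 200):
    x[t] = 0.5 * x[t - 1] + rng.normal(scale=0.5)
X = x.reshape(-1, 1)
y = np.sin(2 * x) + 0.1 * rng.normal(size=200)

alpha = krr_fit(X, y, lam=1e-3, sigma=0.5)
y_hat = krr_predict(X, alpha, X, sigma=0.5)
print(np.mean((y_hat - y) ** 2))  # training error near the noise level
```

The papers listed below analyze how fast \(f_{z,\lambda}\) converges to the regression function when the independence or identical-distribution assumption behind such estimators is relaxed.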
Cites Work
- Title not available
- 10.1162/153244302760200704
- The Invariance Principle for Stationary Processes
- Optimal rates for the regularized least-squares algorithm
- Shannon sampling and function reconstruction from point values
- Leave-One-Out Bounds for Kernel Methods
- Shannon sampling. II: Connections to learning theory
- The covering number in learning theory
- Learning rates of least-square regularized regression
- Learning theory estimates via integral operators and their approximations
- Capacity of reproducing kernel spaces in learning theory
- Almost sure invariance principles for weakly dependent vector-valued random variables
- Model selection for regularized least-squares algorithm in learning theory
- A note on application of integral operator in learning theory
- Online learning with Markov sampling
- Minimum complexity regression estimation with weakly dependent observations
- Regularized least square regression with dependent samples
- Learning from dependent observations
- Learning rates of regularized regression for exponentially strongly mixing sequence
- High order Parzen windows and randomized sampling
Cited In (18)
- Convergence rate for the moving least-squares learning with dependent sampling
- Error analysis for \(l^q\)-coefficient regularized moving least-square regression
- Regularized least-squares regression: learning from a sequence
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- The consistency of least-square regularized regression with negative association sequence
- Coefficient regularized regression with non-iid sampling
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Error analysis of the moving least-squares regression learning algorithm with β-mixing and non-identical sampling
- Learning performance of regularized regression with multiscale kernels based on Markov observations
- Concentration estimates for learning with unbounded sampling
- Learning from non-random data in Hilbert spaces: an optimal recovery perspective
- Regression learning with non-identically and non-independently sampling
- Regularized least square regression with unbounded and dependent sampling
- Error analysis of the moving least-squares method with non-identical sampling
- Optimal learning rates for distribution regression
- Classification with non-i.i.d. sampling
- Regularized least square regression with dependent samples
- Fast learning from \(\alpha\)-mixing observations