Online regression with varying Gaussians and non-identical distributions
DOI: 10.1142/S0219530511001923 · zbMATH Open: 1253.68189 · OpenAlex: W2040468026 · MaRDI QID: Q3096972 · FDO: Q3096972
Authors: Ting Hu
Publication date: 15 November 2011
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530511001923
Keywords: reproducing kernel Hilbert space; online learning; Gaussian kernel; convex loss function; regression algorithm; variance of Gaussian
MSC classifications:
- Learning and adaptive systems in artificial intelligence (68T05)
- Online algorithms; streaming algorithms (68W27)
- Computational learning theory (68Q32)
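The keywords describe online regression in a reproducing kernel Hilbert space with a Gaussian kernel whose variance changes over time. As a rough illustration only (not the paper's exact algorithm), the sketch below runs stochastic gradient descent on a regularized least-squares loss, with hypothetical shrinking schedules for the bandwidth `sigma_t` and step size `eta_t`; all schedule exponents and default parameters are assumptions for demonstration.

```python
import numpy as np

def gaussian(x, c, sigma):
    # Gaussian kernel K_sigma(x, c) = exp(-||x - c||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

def online_gaussian_regression(stream, eta0=0.5, lam=1e-3, sigma0=1.0):
    """Online least-squares regression in an RKHS whose Gaussian
    bandwidth shrinks over time (illustrative schedules, not the paper's)."""
    coefs = []  # (alpha, center, sigma) triples representing the current f_t
    for t, (x, y) in enumerate(stream, start=1):
        sigma_t = sigma0 * t ** -0.25  # assumed variance schedule
        eta_t = eta0 * t ** -0.5       # assumed step-size schedule
        fx = sum(a * gaussian(x, c, s) for a, c, s in coefs)
        # gradient step: shrink old coefficients (regularization term),
        # then add a new kernel centered at the current sample
        coefs = [((1.0 - eta_t * lam) * a, c, s) for a, c, s in coefs]
        coefs.append((-eta_t * (fx - y), x, sigma_t))
    return lambda x: sum(a * gaussian(x, c, s) for a, c, s in coefs)
```

Because the kernel parameter varies between iterations, each coefficient is stored together with its own center and bandwidth; a prediction sums over all stored centers, so practical variants truncate or sparsify the expansion.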
Cites Work
- Online learning algorithms
- On the Generalization Ability of On-Line Learning Algorithms
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- ONLINE LEARNING WITH MARKOV SAMPLING
- Online Regularized Classification Algorithms
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Fully online classification by regularization
- Derivative reproducing properties for kernel methods in learning theory
- Online classification with varying Gaussians
Cited In (28)
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Online learning for quantile regression and support vector regression
- Strongly consistent online forecasting of centered Gaussian processes
- Online classification with varying Gaussians
- Learning rates for classification with Gaussian kernels
- Re-adapting the regularization of weights for non-stationary regression
- On-Line Estimation with the Multivariate Gaussian Distribution
- Streaming kernel regression with provably adaptive mean, variance, and regularization
- Concentration estimates for the moving least-square method in learning theory
- Stability analysis of learning algorithms for ontology similarity computation
- Approximation analysis of learning algorithms for support vector regression and quantile regression
- The multivariate Révész's online estimator of a regression function and its averaging
- Indefinite kernel network with dependent sampling
- Online learning with samples drawn from non-identical distributions
- Second-order non-stationary online learning for regression
- Regularization schemes for minimum error entropy principle
- Unregularized online algorithms with varying Gaussians
- The optimal solution of multi-kernel regularization learning
- Kernel-based online gradient descent using distributed approach
- Causal regression for online estimation of highly nonlinear parametrically varying models
- Online Covariance Matrix Estimation in Stochastic Gradient Descent
- Quantile regression with samples drawn from non-identical distributions
- Online regularized pairwise learning with non-i.i.d. observations
- An oracle inequality for regularized risk minimizers with strongly mixing observations
- Online Bayesian max-margin subspace learning for multi-view classification and regression
- Learning Rates of l^q Coefficient Regularization Learning with Gaussian Kernel
- Approximation by multivariate Bernstein-Durrmeyer operators and learning rates of least-squares regularized regression with multivariate polynomial kernels