Learning theory estimates with observations from general stationary stochastic processes
DOI: 10.1162/NECO_A_00870 · zbMATH Open: 1476.68229 · arXiv: 1605.02887 · OpenAlex: W2963651118 · Wikidata: Q50499957 · Scholia: Q50499957 · MaRDI QID: Q5380606 · FDO: Q5380606
Authors: Hanyuan Hang, Yun-Long Feng, Ingo Steinwart, Johan A. K. Suykens
Publication date: 5 June 2019
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1605.02887
Recommendations
- Fast learning from \(\alpha\)-mixing observations
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- New Bernstein's inequalities for dependent observations and applications to learning theory
- Learning from dependent observations
- Stability bounds for stationary \(\varphi \)-mixing and \(\beta \)-mixing processes
MSC classifications
- Nonparametric estimation (62G05)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Stationary stochastic processes (60G10)
Cites Work
- Stability bounds for stationary \(\varphi \)-mixing and \(\beta \)-mixing processes
- Some Limit Theorems for Stationary Processes
- Nonlinear time series. Nonparametric and parametric methods
- Support Vector Machines
- Title not available
- Convergence of Distributions Generated by Stationary Stochastic Processes
- Correlation theory of stationary and related random functions. Volume II: Supplementary notes and references
- Title not available
- Title not available
- Basic properties of strong mixing conditions. A survey and some open questions
- A distribution-free theory of nonparametric regression
- Statistical behavior and consistency of classification methods based on convex risk minimization
- A central limit theorem and a strong mixing condition
- Title not available
- Bernstein inequality and moderate deviations under strong mixing conditions
- Some Limit Theorems for Random Functions. I
- Introduction to strong mixing conditions. Vol. 1
- Combinatorial methods in density estimation
- A tail inequality for suprema of unbounded empirical processes with applications to Markov chains
- Optimal aggregation of classifiers in statistical learning
- Ergodic mirror descent
- Estimating conditional quantiles with the help of the pinball loss
- Rates of convergence for empirical processes of stationary mixing sequences
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- A note on application of integral operator in learning theory
- Minimum complexity regression estimation with weakly dependent observations
- Regularization in kernel learning
- Regularized least square regression with dependent samples
- Learning from dependent observations
- Concentration of measure inequalities for Markov chains and \(\Phi\)-mixing processes
- Learning and generalisation. With applications to neural networks
- Least-squares regularized regression with dependent samples and \(q\)-penalty
- Learning rates of regularized regression for exponentially strongly mixing sequence
- Bernstein-type large deviations inequalities for partial sums of strong mixing processes
- Exponential inequalities and functional estimations for weak dependent data: applications to dynamical systems
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Two oracle inequalities for regularized boosting classifiers
- Model selection for weakly dependent time series forecasting
- DOI: 10.1162/1532443041424319
- Optimal regression rates for SVMs using Gaussian kernels
- Spectral estimation of the Lévy density in partially observed affine models
- Some Limit Theorems for Random Functions. II
- Title not available
- Limits to classification and regression estimation from ergodic processes
- Fast learning from \(\alpha\)-mixing observations
- Mathematical statistics and stochastic processes
Cited In (12)
- Statistical learning based on Markovian data: maximal deviation inequalities and learning rates
- Generalization bounds for non-stationary mixing processes
- Stability bounds for stationary \(\varphi \)-mixing and \(\beta \)-mixing processes
- Spectral algorithms for learning with dependent observations
- Prediction of dynamical time series using kernel based regression and smooth splines
- Risk bounds of learning processes for Lévy processes
- Title not available
- Estimation and asymptotic properties of a stationary univariate GARCH(\(p,q\)) process
- A class of learning/estimation algorithms using nominal values: Asymptotic analysis and applications
- A Bernstein-type inequality for some mixing processes and dynamical systems with an application to learning
- Fast learning from \(\alpha\)-mixing observations
- Learning from dependent observations