Learning rates for kernel-based expectile regression
From MaRDI portal
Abstract: Conditional expectiles are becoming an increasingly important tool in finance as well as in other areas of application. We analyse a support vector machine type approach for estimating conditional expectiles and establish learning rates that are minimax optimal modulo a logarithmic factor when Gaussian RBF kernels are used and the desired expectile is smooth in a Besov sense. As a special case, our learning rates improve the best known rates for kernel-based least squares regression in this scenario. Key ingredients of our statistical analysis are a general calibration inequality for the asymmetric least squares loss, a corresponding variance bound, and an improved entropy number bound for Gaussian RBF kernels.
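The asymmetric least squares loss and the SVM-type expectile estimator described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' algorithm or analysis: the fixed-point solver for the sample expectile, the iteratively reweighted kernel ridge solver, and the choices of kernel width `gamma` and regularization `lam` are all assumptions made for the sketch.

```python
import numpy as np

def als_loss(y, t, tau):
    """Asymmetric least squares loss L_tau(y, t) = |tau - 1{y < t}| * (y - t)^2."""
    w = np.where(y < t, 1.0 - tau, tau)
    return w * (y - t) ** 2

def sample_expectile(y, tau, n_iter=100):
    """tau-expectile of a sample, via fixed-point iteration on the first-order
    condition of the asymmetric least squares loss (tau = 0.5 gives the mean)."""
    e = np.mean(y)
    for _ in range(n_iter):
        w = np.where(y < e, 1.0 - tau, tau)
        e = np.sum(w * y) / np.sum(w)
    return e

def gaussian_kernel(X, Z, gamma):
    """Gaussian RBF kernel matrix k(x, z) = exp(-gamma * ||x - z||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_expectile(X, y, tau, gamma=1.0, lam=1e-3, n_iter=20):
    """SVM-type estimator sketch: minimize the regularized empirical ALS risk
    (1/n) sum_i L_tau(y_i, f(x_i)) + lam * ||f||_H^2 over f = sum_j alpha_j k(x_j, .),
    by iteratively reweighted kernel ridge regression.  Each step solves the
    normal equations (W K + n*lam*I) alpha = W y for the current weights W."""
    n = len(y)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)  # kernel ridge warm start
    for _ in range(n_iter):
        f = K @ alpha
        w = np.where(y < f, 1.0 - tau, tau)
        alpha = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
    return alpha
```

For tau = 0.5 the loss reduces to ordinary least squares, so the estimator coincides with kernel ridge regression; larger tau pushes the fitted curve toward the upper tail of the conditional distribution.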
Cites work
- scientific article; zbMATH DE number 6378052
- scientific article; zbMATH DE number 5957364
- scientific article; zbMATH DE number 1215245
- scientific article; zbMATH DE number 775727
- scientific article; zbMATH DE number 2208228
- scientific article; zbMATH DE number 5272875
- A distribution-free theory of nonparametric regression
- A dynamic autoregressive expectile for time-invariant portfolio protection strategies
- A new concentration result for regularized risk minimizers
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- An SVM-like approach for expectile regression
- An introduction to Sobolev spaces and interpolation spaces
- Asymmetric Least Squares Estimation and Testing
- Asymmetric least squares regression estimation: A nonparametric approach
- Besov Spaces on Domains in \(\mathbb{R}^d\)
- Coherence and elicitability
- Estimating conditional quantiles with the help of the pinball loss
- Generalized quantiles as risk measures
- Geoadditive expectile regression
- How to compare different loss functions and their risks
- Interpolation of Besov Spaces
- Learning rates of least-square regularized regression
- Maximum-gain working set selection for SVMs
- Model selection for regularized least-squares algorithm in learning theory
- Nonlinear expectile regression with application to value-at-risk and expected shortfall estimation
- Nonparametric conditional autoregressive expectile model via neural network with applications to estimating financial risk
- Nonparametric multiple expectile regression via ER-Boost
- Nonparametric regression expectiles
- On regularization algorithms in learning theory
- On the mathematical foundations of learning
- Optimal learning rates for localized SVMs
- Optimal rates for the regularized least-squares algorithm
- Optimal regression rates for SVMs using Gaussian kernels
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Regression Quantiles
- Regularization in kernel learning
- Relating quantiles and expectiles under weighted-symmetry
- Statistical performance of support vector machines
- Support Vector Machines
- Support vector machine soft margin classifiers: error analysis
- Training SVMs without offset
Cited in (16)
- Local linear estimate of the functional expectile regression
- The local linear functional \(k\)NN estimator of the conditional expectile: uniform consistency in number of neighbors
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Connection between higher order measures of risk and stochastic dominance
- Sobolev norm learning rates for regularized least-squares algorithms
- Optimal learning with Gaussians and correntropy loss
- Kernel estimation under linear-exponential loss
- Optimal learning rates for kernel partial least squares
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- The functional \(k\mathrm{NN}\) estimator of the conditional expectile: uniform consistency in number of neighbors
- Nonparametric estimation of expectile regression in functional dependent data
- An SVM-like approach for expectile regression
- KLERC: kernel Lagrangian expectile regression calculator
- Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- A closer look at covering number bounds for Gaussian kernels