Learning rates for kernel-based expectile regression
From MaRDI portal
Publication:669274
DOI: 10.1007/s10994-018-5762-9 · zbMath: 1480.62067 · arXiv: 1702.07552 · OpenAlex: W2592603639 · Wikidata: Q129210583 · Scholia: Q129210583 · MaRDI QID: Q669274
Ingo Steinwart, Muhammad Farooq
Publication date: 15 March 2019
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1702.07552
Mathematics Subject Classification:
- Applications of statistics to economics (62P20)
- Nonparametric regression and quantile regression (62G08)
- Learning and adaptive systems in artificial intelligence (68T05)
- Limit theorems for vector-valued random variables (infinite-dimensional case) (60B12)
Related Items (9)
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- A closer look at covering number bounds for Gaussian kernels
- Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- The functional \(k\mathrm{NN}\) estimator of the conditional expectile: uniform consistency in number of neighbors
- Unnamed Item
- Optimal learning with Gaussians and correntropy loss
- Local linear estimate of the functional expectile regression
- Nonparametric estimation of expectile regression in functional dependent data
Uses Software
Cites Work
- Asymmetric Least Squares Estimation and Testing
- Geoadditive expectile regression
- Estimating conditional quantiles with the help of the pinball loss
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Model selection for regularized least-squares algorithm in learning theory
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Regularization in kernel learning
- On regularization algorithms in learning theory
- An introduction to Sobolev spaces and interpolation spaces
- An SVM-like approach for expectile regression
- Nonlinear expectile regression with application to value-at-risk and expected shortfall estimation
- A distribution-free theory of nonparametric regression
- Relating quantiles and expectiles under weighted-symmetry
- Optimal regression rates for SVMs using Gaussian kernels
- A dynamic autoregressive expectile for time-invariant portfolio protection strategies
- Generalized quantiles as risk measures
- Optimal rates for the regularized least-squares algorithm
- Statistical performance of support vector machines
- Learning rates of least-square regularized regression
- How to compare different loss functions and their risks
- On the mathematical foundations of learning
- COHERENCE AND ELICITABILITY
- Nonparametric regression expectiles∗
- Support Vector Machines
- A new concentration result for regularized risk minimizers
- Interpolation of Besov Spaces
- Besov Spaces on Domains in ℝ d
- Regression Quantiles
- Asymmetric least squares regression estimation: A nonparametric approach∗
- Nonparametric conditional autoregressive expectile model via neural network with applications to estimating financial risk
- Nonparametric multiple expectile regression via ER-Boost
This page was built for publication: Learning rates for kernel-based expectile regression