Learning rates for kernel-based expectile regression
DOI: 10.1007/S10994-018-5762-9
zbMATH Open: 1480.62067
arXiv: 1702.07552
OpenAlex: W2592603639
Wikidata: Q129210583 (Scholia: Q129210583)
MaRDI QID: Q669274
FDO: Q669274
Authors: Ingo Steinwart, M. U. Farooq
Publication date: 15 March 2019
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1702.07552
Mathematics Subject Classification
- Nonparametric regression and quantile regression (62G08)
- Applications of statistics to economics (62P20)
- Learning and adaptive systems in artificial intelligence (68T05)
- Limit theorems for vector-valued random variables (infinite-dimensional case) (60B12)
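For orientation, the conditional \(\tau\)-expectile studied in this publication is standardly characterized through the asymmetric least squares (ALS) loss of Newey and Powell (cited below); a minimal formulation, assuming the usual conventions, is
\[
L_\tau(y,t) \;=\; \bigl|\tau - \mathbf{1}_{\{y < t\}}\bigr|\,(y-t)^2, \qquad \tau \in (0,1),
\]
and the conditional \(\tau\)-expectile at \(x\) is the minimizer of \(t \mapsto \mathbb{E}\bigl[L_\tau(Y,t)\mid X=x\bigr]\); for \(\tau = 1/2\) this reduces to ordinary least squares regression.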
Cites Work
- Title not available
- Regression Quantiles
- Title not available
- Asymmetric Least Squares Estimation and Testing
- Support Vector Machines
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- On the mathematical foundations of learning
- Relating quantiles and expectiles under weighted-symmetry
- Generalized quantiles as risk measures
- Coherence and elicitability
- Geoadditive expectile regression
- Asymmetric least squares regression estimation: A nonparametric approach∗
- Nonparametric multiple expectile regression via ER-Boost
- Title not available
- A distribution-free theory of nonparametric regression
- Title not available
- Optimal rates for the regularized least-squares algorithm
- Support vector machine soft margin classifiers: error analysis
- Title not available
- Interpolation of Besov Spaces
- Besov Spaces on Domains in \(\mathbb{R}^d\)
- Learning rates of least-square regularized regression
- Estimating conditional quantiles with the help of the pinball loss
- Statistical performance of support vector machines
- A new concentration result for regularized risk minimizers
- Model selection for regularized least-squares algorithm in learning theory
- On regularization algorithms in learning theory
- An introduction to Sobolev spaces and interpolation spaces
- Regularization in kernel learning
- Training SVMs without offset
- Optimal regression rates for SVMs using Gaussian kernels
- How to compare different loss functions and their risks
- Maximum-gain working set selection for SVMs
- Title not available
- Oracle inequalities for support vector machines that are based on random entropy numbers
- An SVM-like approach for expectile regression
- Nonlinear expectile regression with application to value-at-risk and expected shortfall estimation
- A dynamic autoregressive expectile for time-invariant portfolio protection strategies
- Title not available
- Nonparametric regression expectiles∗
- Nonparametric conditional autoregressive expectile model via neural network with applications to estimating financial risk
Cited In (13)
- Nonparametric estimation of expectile regression in functional dependent data
- Local linear estimate of the functional expectile regression
- Optimal learning rates for kernel partial least squares
- A closer look at covering number bounds for Gaussian kernels
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Title not available
- Connection between higher order measures of risk and stochastic dominance
- Analysis of regularized least-squares in reproducing kernel Kreĭn spaces
- The local linear functional \(k\)NN estimator of the conditional expectile: uniform consistency in number of neighbors
- Kernel estimation under linear-exponential loss
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- The functional \(k\mathrm{NN}\) estimator of the conditional expectile: uniform consistency in number of neighbors
- Optimal learning with Gaussians and correntropy loss