Estimating conditional quantiles with the help of the pinball loss
From MaRDI portal
Publication: Q637098
DOI: 10.3150/10-BEJ267 · zbMATH Open: 1284.62235 · arXiv: 1102.2101 · Wikidata: Q59196384 · Scholia: Q59196384 · MaRDI QID: Q637098 · FDO: Q637098
Authors: Ingo Steinwart, Andreas Christmann
Publication date: 2 September 2011
Published in: Bernoulli
Abstract: The so-called pinball loss for estimating conditional quantiles is a well-known tool in both statistics and machine learning. So far, however, little work has been done to quantify the efficiency of this tool for nonparametric approaches. We fill this gap by establishing inequalities that describe how close approximate pinball risk minimizers are to the corresponding conditional quantile. These inequalities, which hold under mild assumptions on the data-generating distribution, are then used to establish so-called variance bounds, which recently turned out to play an important role in the statistical analysis of (regularized) empirical risk minimization approaches. Finally, we use both types of inequalities to establish an oracle inequality for support vector machines that use the pinball loss. The resulting learning rates are minimax optimal under some standard regularity assumptions on the conditional quantile.
Full work available at URL: https://arxiv.org/abs/1102.2101
Recommendations
- Quantile estimators with orthogonal pinball loss function
- Conditional quantile estimation through optimal quantization
- Conditional quantile estimation based on optimal quantization: from theory to practice
- Conditional quantile estimation using optimal quantization: a numerical study
- Quasi-maximum likelihood estimation for conditional quantiles
- Estimation of conditional quantile density function
- Non-parametric estimation of conditional quantiles
- Conditional empirical likelihood for quantile regression models
- Estimation of high conditional quantiles for heavy-tailed distributions
Mathematics Subject Classification:
- Nonparametric estimation (62G05)
- Asymptotic properties of nonparametric inference (62G20)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- Quantile regression.
- Learning Theory
- Support Vector Machines
- Smooth discrimination analysis
- Local Rademacher complexities
- Title not available
- Some applications of concentration inequalities to statistics
- Optimal rates for the regularized least-squares algorithm
- Title not available
- Title not available
- Convexity, Classification, and Risk Bounds
- Title not available
- Optimal aggregation of classifiers in statistical learning.
- PIECEWISE-POLYNOMIAL APPROXIMATIONS OF FUNCTIONS OF THE CLASSES $ W_{p}^{\alpha}$
- Measure and integration theory. Transl. from the German by Robert B. Burckel
- Statistical performance of support vector machines
- Information-theoretic determination of minimax rates of convergence
- On consistency and robustness properties of support vector machines for heavy-tailed distributions
- Regularization in kernel learning
- Bi-level path following for cross validated solution of kernel quantile regression
- Optimal estimators in learning theory
- How to compare different loss functions and their risks
- Title not available
- Title not available
- Oracle inequalities for support vector machines that are based on random entropy numbers
Cited In (51)
- Large margin unified machines with non-i.i.d. process
- Conformalized-DeepONet: a distribution-free framework for uncertainty quantification in deep operator networks
- Kernel-based sparse regression with the correntropy-induced loss
- Convergence rate of SVM for kernel-based robust regression
- A simpler approach to coefficient regularized support vector machines regression
- Calibration of \(\epsilon\)-insensitive loss in support vector machines regression
- Online learning for quantile regression and support vector regression
- An SVM-like approach for expectile regression
- Asymmetric least squares support vector machine classifiers
- Asymmetric \(\nu\)-tube support vector regression
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Robust kernel-based distribution regression
- Mitigating robust overfitting via self-residual-calibration regularization
- Measuring the capacity of sets of functions in the analysis of ERM
- Sparse additive machine with ramp loss
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Statistical consistency of coefficient-based conditional quantile regression
- A new support vector machine plus with pinball loss
- Separability of reproducing kernel spaces
- Approximation analysis of learning algorithms for support vector regression and quantile regression
- Quantile regression with \(\ell_1\)-regularization and Gaussian kernels
- Analysis of regression algorithms with unbounded sampling
- A unified penalized method for sparse additive quantile models: an RKHS approach
- Title not available
- Quantile estimators with orthogonal pinball loss function
- Title not available
- Robust support vector quantile regression with truncated pinball loss (RSVQR)
- Stochastic online convex optimization. Application to probabilistic time series forecasting
- A robust regression framework with Laplace kernel-induced loss
- Conditional quantiles with varying Gaussians
- An introduction to copula-based bivariate reliability concepts
- On Lagrangian L2-norm pinball twin bounded support vector machine via unconstrained convex minimization
- Nonparametric estimation of a maximum of quantiles
- Coefficient-based regularization network with variance loss for error
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- Perturbation of convex risk minimization and its application in differential private learning algorithms
- Optimal regression rates for SVMs using Gaussian kernels
- A new comparison theorem on conditional quantiles
- Approximation on variable exponent spaces by linear integral operators
- Estimation of quantile oriented sensitivity indices
- Quantitative convergence analysis of kernel based large-margin unified machines
- Moving quantile regression
- Analysis of approximation by linear operators on variable \(L_\rho^{p(\cdot)}\) spaces and applications in learning theory
- Error analysis of classification learning algorithms based on LUMs loss
- Learning with varying insensitive loss
- Sparse online regression algorithm with insensitive loss functions
- Learning rates for kernel-based expectile regression
- Fast learning from \(\alpha\)-mixing observations
- Conformal Prediction: A Gentle Introduction
- Estimation of scale functions to model heteroscedasticity by regularised kernel-based quantile methods
- Learning theory estimates with observations from general stationary stochastic processes
This page was built for publication: Estimating conditional quantiles with the help of the pinball loss