Estimating conditional quantiles with the help of the pinball loss
Abstract: The so-called pinball loss for estimating conditional quantiles is a well-known tool in both statistics and machine learning. So far, however, little work has been done to quantify the efficiency of this tool for nonparametric approaches. We fill this gap by establishing inequalities that describe how close approximate pinball risk minimizers are to the corresponding conditional quantile. These inequalities, which hold under mild assumptions on the data-generating distribution, are then used to establish so-called variance bounds, which have recently turned out to play an important role in the statistical analysis of (regularized) empirical risk minimization approaches. Finally, we use both types of inequalities to establish an oracle inequality for support vector machines that use the pinball loss. The resulting learning rates are minimax optimal under some standard regularity assumptions on the conditional quantile.
Recommendations
- Quantile estimators with orthogonal pinball loss function
- Conditional quantile estimation through optimal quantization
- Conditional quantile estimation based on optimal quantization: from theory to practice
- Conditional quantile estimation using optimal quantization: a numerical study
- Quasi-maximum likelihood estimation for conditional quantiles
- Estimation of conditional quantile density function
- Non-parametric estimation of conditional quantiles
- Conditional empirical likelihood for quantile regression models
- Estimation of high conditional quantiles for heavy-tailed distributions
Cites work
- scientific article; zbMATH DE number 5957364
- scientific article; zbMATH DE number 1804108
- scientific article; zbMATH DE number 44592
- scientific article; zbMATH DE number 192914
- scientific article; zbMATH DE number 1827090
- scientific article; zbMATH DE number 962825
- Bi-level path following for cross validated solution of kernel quantile regression
- Convexity, Classification, and Risk Bounds
- How to compare different loss functions and their risks
- Information-theoretic determination of minimax rates of convergence
- Learning Theory
- Local Rademacher complexities
- Measure and integration theory. Transl. from the German by Robert B. Burckel
- On consistency and robustness properties of support vector machines for heavy-tailed distributions
- Optimal aggregation of classifiers in statistical learning
- Optimal estimators in learning theory
- Optimal rates for the regularized least-squares algorithm
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Piecewise-polynomial approximations of functions of the classes \(W_p^\alpha\)
- Quantile regression
- Regularization in kernel learning
- Smooth discrimination analysis
- Some applications of concentration inequalities to statistics
- Statistical performance of support vector machines
- Support Vector Machines
Cited in (51)
- Conformal Prediction: A Gentle Introduction
- Estimation of scale functions to model heteroscedasticity by regularised kernel-based quantile methods
- Kernel-based sparse regression with the correntropy-induced loss
- Learning theory estimates with observations from general stationary stochastic processes
- Convergence rate of SVM for kernel-based robust regression
- A simpler approach to coefficient regularized support vector machines regression
- Calibration of \(\epsilon\)-insensitive loss in support vector machines regression
- Online learning for quantile regression and support vector regression
- An SVM-like approach for expectile regression
- Asymmetric least squares support vector machine classifiers
- Asymmetric \(\nu\)-tube support vector regression
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Robust kernel-based distribution regression
- Mitigating robust overfitting via self-residual-calibration regularization
- Measuring the capacity of sets of functions in the analysis of ERM
- Statistical consistency of coefficient-based conditional quantile regression
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- A new support vector machine plus with pinball loss
- Sparse additive machine with ramp loss
- Approximation analysis of learning algorithms for support vector regression and quantile regression
- Separability of reproducing kernel spaces
- Quantile regression with \(\ell_1\)-regularization and Gaussian kernels
- Large margin unified machines with non-i.i.d. process
- Analysis of regression algorithms with unbounded sampling
- A unified penalized method for sparse additive quantile models: an RKHS approach
- scientific article; zbMATH DE number 7450736
- Quantile estimators with orthogonal pinball loss function
- scientific article; zbMATH DE number 7415083
- Robust support vector quantile regression with truncated pinball loss (RSVQR)
- Stochastic online convex optimization. Application to probabilistic time series forecasting
- Conditional quantiles with varying Gaussians
- A robust regression framework with Laplace kernel-induced loss
- Nonparametric estimation of a maximum of quantiles
- An introduction to copula-based bivariate reliability concepts
- On Lagrangian L2-norm pinball twin bounded support vector machine via unconstrained convex minimization
- Coefficient-based regularization network with variance loss for error
- Perturbation of convex risk minimization and its application in differential private learning algorithms
- A new comparison theorem on conditional quantiles
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- Optimal regression rates for SVMs using Gaussian kernels
- Approximation on variable exponent spaces by linear integral operators
- Estimation of quantile oriented sensitivity indices
- Quantitative convergence analysis of kernel based large-margin unified machines
- Moving quantile regression
- Conformalized-DeepONet: a distribution-free framework for uncertainty quantification in deep operator networks
- Analysis of approximation by linear operators on variable \(L_\rho^{p(\cdot)}\) spaces and applications in learning theory
- Learning with varying insensitive loss
- Error analysis of classification learning algorithms based on LUMs loss
- Learning rates for kernel-based expectile regression
- Sparse online regression algorithm with insensitive loss functions
- Fast learning from \(\alpha\)-mixing observations
MaRDI item: Q637098