Quantile universal threshold
From MaRDI portal
Abstract: Efficient recovery of a low-dimensional structure from high-dimensional data has been pursued in various settings, including wavelet denoising, generalized linear models and low-rank matrix estimation. By thresholding some parameters to zero, estimators such as lasso, elastic net and subset selection perform not only parameter estimation but also variable selection, leading to sparsity. Yet one crucial step challenges all these estimators: the choice of the threshold parameter. If it is too large, important features are missed; if too small, incorrect features are included. Within a unified framework, we propose a new selection of the threshold at the detection edge under the null model. To that aim, we introduce the concepts of a zero-thresholding function and a null-thresholding statistic, which we explicitly derive for a large class of estimators. The new approach has the great advantage of transforming the selection of the threshold from an unknown scale to a probabilistic scale, requiring only the simple selection of a probability level. Numerical results show the effectiveness of our approach in terms of model selection and prediction.
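To make the idea concrete, here is a hedged Monte Carlo sketch for the lasso case. For the lasso, the zero-thresholding function is λ₀(y) = ‖Xᵀy‖∞, the smallest penalty that sets all coefficients to zero; the quantile universal threshold takes λ as the (1 − α)-quantile of λ₀ evaluated under the null model y = ε. The function name, the default α = 0.05 and the Monte Carlo size M are illustrative choices, not specifics from the paper.

```python
import numpy as np

def quantile_universal_threshold(X, sigma=1.0, alpha=0.05, M=1000, rng=None):
    """Monte Carlo (1 - alpha)-quantile of the null-thresholding statistic
    ||X^T eps||_inf, with eps ~ N(0, sigma^2 I) under the null model (no signal)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    stats = np.empty(M)
    for m in range(M):
        eps = rng.normal(0.0, sigma, size=n)          # null model: pure noise
        stats[m] = np.max(np.abs(X.T @ eps))          # zero-thresholding function lambda_0
    return np.quantile(stats, 1.0 - alpha)
```

With an orthonormal design (e.g. `X = np.eye(p)`), the statistic is the maximum of p absolute Gaussians, so the resulting threshold is close to the universal threshold σ√(2 log p) familiar from wavelet denoising, which illustrates the probabilistic scale: only the level α is chosen by the user.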
Recommendations
- Thresholding in learning theory
- Regularized Bayesian estimation of generalized threshold regression models
- Bayesian regularized quantile regression
- Model selection in high-dimensional quantile regression with seamless L₀ penalty
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
Cited in (13)
- A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
- Nonparametric Estimation of Galaxy Cluster Emissivity and Detection of Point Sources in Astrophysics With Two Lasso Penalties
- Low-rank model with covariates for count data with missing values
- On the use of cross-validation for the calibration of the adaptive Lasso
- Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension
- \(\ell_1\)-penalised ordinal polytomous regression estimators with application to gene expression studies
- A multi-resolution theory for approximating infinite-\(p\)-zero-\(n\): transitional inference, individualized predictions, and a world without bias-variance tradeoff
- qut
- Sparse additive models in high dimensions with wavelets
- Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles
- Variable Selection With Second-Generation P-Values
- Random thresholds for linear model selection
- Random threshold for linear model selection, revisited
This page was built for publication: Quantile universal threshold
MaRDI item Q131212