High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms
From MaRDI portal
Publication:6184871
Abstract: Composite quantile regression (CQR) was introduced by Zou and Yuan [Ann. Statist. 36 (2008) 1108--1126] as a robust regression method for linear models with heavy-tailed errors that also achieves high efficiency. Its penalized counterpart for high-dimensional sparse models was recently studied in Gu and Zou [IEEE Trans. Inf. Theory 66 (2020) 7132--7154], along with a specialized optimization algorithm based on the alternating direction method of multipliers (ADMM). Compared with the various first-order algorithms for penalized least squares, ADMM-based algorithms are not well-adapted to large-scale problems. To overcome this computational bottleneck, in this paper we apply a convolution smoothing technique to CQR, complemented with iteratively reweighted \(\ell_1\)-regularization. The smoothed composite loss function is convex, twice continuously differentiable, and locally strongly convex with high probability. We propose a gradient-based algorithm for penalized smoothed CQR via a variant of the majorize-minimization principle, which gains substantial computational efficiency over ADMM. Theoretically, we show that the iteratively reweighted \(\ell_1\)-penalized smoothed CQR estimator achieves a near-minimax optimal convergence rate under heavy-tailed errors without any moment constraint, and further achieves a near-oracle convergence rate under a weaker minimum signal strength condition than that required in Gu and Zou (2020). Numerical studies demonstrate that the proposed method offers significant computational advantages without compromising statistical performance, compared with two state-of-the-art methods that achieve robustness and high efficiency simultaneously.
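To make the smoothing idea concrete, the following is a minimal NumPy sketch (not the authors' implementation) of a convolution-smoothed check loss with a Gaussian kernel, combined into a composite loss over several quantile levels with a shared slope vector and level-specific intercepts. The closed form \(\ell_h(u) = u(\tau - 1 + \Phi(u/h)) + h\,\phi(u/h)\) follows from convolving the check loss \(\rho_\tau\) with a Gaussian density of bandwidth \(h\); all function names here are illustrative assumptions.

```python
import math
import numpy as np

_erf = np.vectorize(math.erf)

def norm_pdf(z):
    return np.exp(-z ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + _erf(np.asarray(z) / np.sqrt(2.0)))

def smoothed_check_loss(u, tau, h):
    """Gaussian-kernel convolution smoothing of the check loss rho_tau.

    Closed form of E[rho_tau(u - h*Z)] for Z ~ N(0, 1); it is convex,
    twice differentiable, and converges to rho_tau as h -> 0.
    """
    z = u / h
    return u * (tau - 1.0 + norm_cdf(z)) + h * norm_pdf(z)

def composite_loss_grad(b, beta, X, y, taus, h):
    """Smoothed composite loss averaged over quantile levels taus,
    with shared slope beta and per-level intercepts b[k].
    Returns (loss, grad wrt b, grad wrt beta)."""
    n, K = len(y), len(taus)
    loss, grad_b, grad_beta = 0.0, np.zeros(K), np.zeros(X.shape[1])
    for k, tau in enumerate(taus):
        u = y - b[k] - X @ beta
        loss += smoothed_check_loss(u, tau, h).mean()
        # derivative of the smoothed loss in u: Phi(u/h) - (1 - tau)
        w = norm_cdf(u / h) - (1.0 - tau)
        grad_b[k] = -w.mean()
        grad_beta += -(X.T @ w) / n
    return loss / K, grad_b / K, grad_beta / K
```

Because the smoothed loss is differentiable, the composite objective can be minimized with plain gradient-based (e.g. majorize-minimization) steps instead of ADMM; the iteratively reweighted \(\ell_1\) penalty discussed in the paper would be layered on top of this gradient oracle.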
Recommendations
- Composite quantile regression for ultra-high dimensional semiparametric model averaging
- Composite quantile regression for massive datasets
- Sparse Composite Quantile Regression with Ultra-high Dimensional Heterogeneous Data
- Single-index composite quantile regression for ultra-high-dimensional data
- Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration
- Advanced algorithms for penalized quantile and composite quantile regression
- Optimal subsampling algorithms for composite quantile regression in massive data
- A note on the efficiency of composite quantile regression
- An effective method to reduce the computational complexity of composite quantile regression
- Distributed Sparse Composite Quantile Regression in Ultrahigh Dimensions
Cites work
- scientific article; zbMATH DE number 49190
- scientific article; zbMATH DE number 2034517
- scientific article; zbMATH DE number 4001209
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 6438182
- A general theory of concave regularization for high-dimensional sparse estimation problems
- Adaptive Huber Regression
- Atomic Decomposition by Basis Pursuit
- Composite quantile regression and the oracle model selection theory
- Concentration inequalities. A nonasymptotic theory of independence
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- High-Dimensional Quantile Regression: Convolution Smoothing and Concave Regularization
- High-dimensional probability. An introduction with applications in data science
- High-dimensional statistics. A non-asymptotic viewpoint
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Local Composite Quantile Regression Smoothing: An Efficient and Safe Alternative to Local Polynomial Regression
- Minimum distance Lasso for robust high-dimensional regression
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Nearly unbiased variable selection under minimax concave penalty
- One-step sparse estimates in nonconcave penalized likelihood models
- Optimization with sparsity-inducing penalties
- Parallelizing the dual revised simplex method
- Parametric estimation. Finite sample theory
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Reconstruction From Anisotropic Random Measurements
- Regression Quantiles
- Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Robust Variable Selection With Exponential Squared Loss
- Robust and consistent variable selection in high-dimensional generalized linear models
- Scaling-up empirical risk minimization: optimization of incomplete \(U\)-statistics
- Simultaneous analysis of Lasso and Dantzig selector
- Smoothed quantile regression with large-scale inference
- Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Statistical foundations of data science
- Statistics for high-dimensional data. Methods, theory and applications.
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Weak convergence and empirical processes. With applications to statistics
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in (6 documents)
- An adaptive composite quantile approach to dimension reduction
- Distributed Sparse Composite Quantile Regression in Ultrahigh Dimensions
- Composite quantile regression for massive datasets
- Composite quantile regression for ultra-high dimensional semiparametric model averaging
- Advanced algorithms for penalized quantile and composite quantile regression
- Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration