Smoothed Quantile Regression with Large-Scale Inference
From MaRDI portal
Publication:6355668
DOI: 10.1016/j.jeconom.2021.07.010 · arXiv: 2012.05187 · MaRDI QID: Q6355668
Authors: Xuming He, Xiaoou Pan, Kean Ming Tan, Wen-Xin Zhou
Publication date: 9 December 2020
Abstract: Quantile regression is a powerful tool for learning the relationship between a response variable and a multivariate predictor while exploring heterogeneous effects. In this paper, we consider statistical inference for quantile regression with large-scale data in the "increasing dimension" regime. We provide a comprehensive and in-depth analysis of a convolution-type smoothing approach that achieves adequate approximation to computation and inference for quantile regression. This method, which we refer to as conquer, turns the non-differentiable quantile loss function into a twice-differentiable, convex and locally strongly convex surrogate, which admits a fast and scalable Barzilai-Borwein gradient-based algorithm to perform optimization, and a multiplier bootstrap for statistical inference. Theoretically, we establish explicit non-asymptotic bounds on both estimation and Bahadur-Kiefer linearization errors, from which we show that the asymptotic normality of the conquer estimator holds under a weaker requirement on the number of regressors than needed for conventional quantile regression. Moreover, we prove the validity of multiplier bootstrap confidence constructions. Our numerical studies confirm the conquer estimator as a practical and reliable approach to large-scale inference for quantile regression. Software implementing the methodology is available in the R package conquer.
Statistics (62-XX); Game theory, economics, finance, and other social and behavioral sciences (91-XX)
This page was built for publication: Smoothed Quantile Regression with Large-Scale Inference