High-dimensional robust approximated M-estimators for mean regression with asymmetric data

From MaRDI portal
Publication:6327615

DOI: 10.1016/J.JMVA.2022.105080
arXiv: 1910.09493
Wikidata: Q113870423 (Scholia: Q113870423)
MaRDI QID: Q6327615
FDO: Q6327615


Authors: Bin Luo, Xiaoli Gao


Publication date: 21 October 2019

Abstract: Asymmetry, along with heteroscedasticity or contamination, often occurs as data dimensionality grows. In ultra-high-dimensional data analysis, such irregular settings are usually overlooked for both theoretical and computational convenience. In this paper, we establish a framework for estimation in high-dimensional regression models using Penalized Robust Approximated quadratic M-estimators (PRAM). This framework allows general settings in which the random errors lack symmetry and homogeneity, or the covariates are not sub-Gaussian. To reduce the possible bias caused by the data's irregularity in mean regression, PRAM adopts a loss function with a flexible robustness parameter growing with the sample size. Theoretically, we first show that, in the ultra-high-dimensional setting, PRAM estimators achieve local estimation consistency at the minimax rate enjoyed by the LS-Lasso. We then show that PRAM with an appropriate non-convex penalty in fact agrees with the local oracle solution, and thus enjoys the oracle property. Computationally, we demonstrate the performance of six PRAM estimators using three types of loss functions for approximation (Huber, Tukey's biweight and Cauchy loss) combined with two types of penalty functions (Lasso and MCP). Our simulation studies and real data analysis demonstrate satisfactory finite-sample performance of the PRAM estimator under general irregular settings.
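The abstract describes combining a robust loss with a sparsity penalty. As an illustration only (not the authors' implementation), the sketch below fits one such estimator, Huber loss plus a Lasso penalty, by proximal gradient descent; the function name `pram_huber_lasso`, the step-size rule, and all parameter defaults are assumptions for this sketch.

```python
import numpy as np

def huber_grad(r, alpha):
    # Derivative of the Huber loss in the residual r: identity on
    # |r| <= alpha, capped at +/- alpha beyond (bounds outlier influence).
    return np.where(np.abs(r) <= alpha, r, alpha * np.sign(r))

def soft_threshold(z, t):
    # Proximal operator of the L1 (Lasso) penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def pram_huber_lasso(X, y, lam, alpha, n_iter=500):
    """Hypothetical sketch: Huber-loss Lasso via proximal gradient.

    lam   -- Lasso penalty level
    alpha -- Huber robustness parameter (in the paper this grows with n)
    """
    n, p = X.shape
    # The Huber score is 1-Lipschitz, so the gradient of the averaged
    # loss has Lipschitz constant ||X||_2^2 / n; use step = 1 / L.
    step = n / (np.linalg.norm(X, 2) ** 2)
    beta = np.zeros(p)
    for _ in range(n_iter):
        r = y - X @ beta
        grad = -X.T @ huber_grad(r, alpha) / n   # gradient of robust loss
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

The bounded score function is what limits the bias from asymmetric or contaminated errors; swapping in Tukey's biweight or Cauchy score, or replacing the soft-threshold step with an MCP proximal map, gives the other estimator variants mentioned in the abstract.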

