Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
From MaRDI portal
Publication: Q2172011
Cites work
- scientific article; zbMATH DE number 5957408 (no title available)
- scientific article; zbMATH DE number 3169866 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- A general Bahadur representation of \(M\)-estimators and its application to linear regression with nonstochastic designs
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- A high-dimensional nonparametric multivariate test for mean vector
- A new perspective on robust \(M\)-estimation: finite sample theory and applications to dependence-adjusted multiple testing
- Adaptive Huber Regression
- Adaptive robust variable selection
- Asymptotic behavior of \(M\)-estimators of \(p\) regression parameters when \(p^2/n\) is large. II: Normal approximation
- Asymptotic behavior of M-estimators for the linear model
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Composite quantile regression and the oracle model selection theory
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence intervals for high-dimensional inverse covariance estimation
- Confidence intervals for high-dimensional linear regression: minimax rates and adaptivity
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Empirical properties of asset returns: stylized facts and statistical issues
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Feature selection for varying coefficient models with ultrahigh-dimensional covariates
- High-Dimensional Variable Selection for Survival Data
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional probability. An introduction with applications in data science
- Nearly unbiased variable selection under minimax concave penalty
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On parameters of increasing dimensions
- One-Step Huber Estimates in the Linear Model
- Penalized composite quasi-likelihood for ultrahigh dimensional variable selection
- Robust Estimation of a Location Parameter
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Sharp thresholds for high-dimensional and noisy sparsity recovery using \(\ell_1\)-constrained quadratic programming (Lasso)
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems
- Valid post-selection inference in high-dimensional approximately sparse quantile regression models
- Variable selection in quantile regression
- \(L_1\)-regularized least squares for support recovery of high dimensional single index models with Gaussian designs
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
Cited in (6 publications)
- Optimal decorrelated score subsampling for generalized linear models with massive data
- Renewable Huber estimation method for streaming datasets
- Adaptive Huber trace regression with low-rank matrix parameter via nonconvex regularization
- High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Double debiased transfer learning for adaptive Huber regression