Adaptive Huber Regression
Publication: 3304852
DOI: 10.1080/01621459.2018.1543124
zbMath: 1437.62250
arXiv: 1706.06991
OpenAlex: W2896398456
Wikidata: Q101166927 (Scholia: Q101166927)
MaRDI QID: Q3304852
Authors: Qiang Sun, Wen-Xin Zhou, Jianqing Fan
Publication date: 3 August 2020
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1706.06991
Keywords: phase transition; heavy-tailed data; adaptive Huber regression; finite-sample inference; bias and robustness tradeoff; nonasymptotic optimality
Mathematics Subject Classification:
- Robustness and adaptive procedures (parametric inference) (62F35)
- Statistics of extreme values; tail inference (62G32)
- General nonlinear regression (62J02)
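The keywords above name the central technique: Huber regression whose robustification parameter adapts to the sample, trading bias against robustness under heavy-tailed errors. The Python snippet below is a minimal, non-authoritative sketch of that idea, not the authors' implementation; the pilot OLS fit, the noise-scale estimate sigma_hat, the scaling tau = sigma_hat * sqrt(n / log n), and the use of scipy.optimize.minimize are illustrative assumptions, whereas the paper calibrates tau from the data, the dimension, and the target confidence level.

```python
import numpy as np
from scipy.optimize import minimize

def huber_loss(r, tau):
    """Huber loss: quadratic for |r| <= tau, linear beyond (r = residual)."""
    a = np.abs(r)
    return np.where(a <= tau, 0.5 * r**2, tau * a - 0.5 * tau**2)

def adaptive_huber_fit(X, y):
    """Sketch of Huber regression with a sample-size-adaptive tau.

    The scaling tau = sigma_hat * sqrt(n / log n) is an assumption made for
    illustration only; the paper's choice of tau also accounts for the
    dimension, the moment conditions, and the desired confidence level.
    """
    n, d = X.shape
    # Pilot fit (ordinary least squares) to get a rough noise scale.
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma_hat = np.std(y - X @ beta_ols)
    tau = sigma_hat * np.sqrt(n / np.log(n))  # adaptive robustification parameter

    # Minimize the empirical Huber risk in beta.
    objective = lambda beta: huber_loss(y - X @ beta, tau).mean()
    res = minimize(objective, beta_ols, method="BFGS")
    return res.x, tau

# Usage: heavy-tailed noise where plain least squares is unstable.
rng = np.random.default_rng(0)
n, d = 500, 5
X = rng.standard_normal((n, d))
beta_true = np.ones(d)
y = X @ beta_true + rng.standard_t(df=2.1, size=n)  # variance barely finite
beta_hat, tau = adaptive_huber_fit(X, y)
print("tau =", round(tau, 2), "estimate =", np.round(beta_hat, 2))
```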
Related Items
- The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- The main contributions of robust statistics to statistical science and a new challenge
- A New Principle for Tuning-Free Huber Regression
- FarmTest: Factor-Adjusted Robust Multiple Testing With Approximate False Discovery Control
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Concentration study of M-estimators using the influence function
- Safe feature screening rules for the regularized Huber regression
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Robust sparse precision matrix estimation for high-dimensional compositional data
- High-dimensional robust regression with \(L_q\)-loss functions
- Automatic bias correction for testing in high-dimensional linear models
- High-dimensional \(M\)-estimation for Byzantine-robust decentralized learning
- Distributed Sparse Composite Quantile Regression in Ultrahigh Dimensions
- Huber estimation for the network autoregressive model
- Non-asymptotic analysis and inference for an outlyingness induced winsorized mean
- Adaptive robust large volatility matrix estimation based on high-frequency financial data
- Large-Scale Inference of Multivariate Regression for Heavy-Tailed and Asymmetric Data
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression
- Rate-optimal robust estimation of high-dimensional vector autoregressive models
- High-dimensional robust inference for censored linear models
- Robust matrix estimations meet Frank-Wolfe algorithm
- Volatility models for stylized facts of high-frequency financial data
- Robust inference for high-dimensional single index models
- Sparse Reduced Rank Huber Regression in High Dimensions
- Retire: robust expectile regression in high dimensions
- Volatility prediction comparison via robust volatility proxies: an empirical deviation perspective
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- Robust high-dimensional tuning free multiple testing
- High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms
- Diagnostic Testing of Finite Moment Conditions for the Consistency and Root-N Asymptotic Normality of the GMM and M Estimators
- Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates
- Renewable Huber estimation method for streaming datasets
- Robust projected principal component analysis for large-dimensional semiparametric factor modeling
- Nonasymptotic analysis of robust regression with modified Huber's loss
- A Tuning-free Robust and Efficient Approach to High-dimensional Regression
- Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Robust pairwise learning with Huber loss
- User-friendly covariance estimation for heavy-tailed distributions
- Iteratively reweighted \(\ell_1\)-penalized robust regression
- Matrix optimization based Euclidean embedding with outliers
- Regularization parameter selection for the low rank matrix recovery
- A statistical learning assessment of Huber regression
- Degrees of freedom for regularized regression with Huber loss and linear constraints
- A generalized Catoni's M-estimator under finite \(\alpha\)-th moment assumption with \(\alpha \in (1,2)\)
- Distributed adaptive Huber regression
- Quantile regression feature selection and estimation with grouped variables using Huber approximation
- Robust parameter estimation of regression models under weakened moment assumptions
- Functional linear regression with Huber loss
- Penalized unimodal spline density estimation with application to \(M\)-estimation
Cites Work
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Sub-Gaussian mean estimators
- Statistics for high-dimensional data. Methods, theory and applications.
- Globally adaptive quantile regression with ultra-high dimensional data
- Empirical risk minimization for heavy-tailed losses
- SLOPE-adaptive variable selection via convex optimization
- On a notion of data depth based on random simplices
- Asymptotic behavior of \(M\)-estimators of \(p\) regression parameters when \(p^2/n\) is large. II: Normal approximation
- Asymptotics with increasing dimension for robust regression with applications to the bootstrap
- Asymptotic behavior of M-estimators for the linear model
- A general Bahadur representation of \(M\)-estimators and its application to linear regression with nonstochastic designs
- Multivariate analysis by data depth: Descriptive statistics, graphics and inference. (With discussions and rejoinder)
- On parameters of increasing dimensions
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Robust regression: Asymptotics, conjectures and Monte Carlo
- General notions of statistical depth function.
- On depth and deep points: A calculus.
- Least angle regression. (With discussion)
- Challenging the empirical mean and empirical variance: a deviation study
- Slope meets Lasso: improved oracle bounds and optimality
- Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
- Simultaneous analysis of Lasso and Dantzig selector
- Robust PCA and pairs of projections in a Hilbert space
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- Asymptotically Minimax Adaptive Estimation. I: Upper Bounds. Optimally Adaptive Estimates
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Robustness and Accuracy of Methods for High Dimensional Data Analysis Based on Student’s t-Statistic
- Empirical properties of asset returns: stylized facts and statistical issues
- Location–Scale Depth
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- A High-Dimensional Nonparametric Multivariate Test for Mean Vector
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Robust Estimation of a Location Parameter
- Error Distribution for Gene Expression Data