Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
Publication: 2012209
DOI: 10.1214/16-AOS1471
zbMath: 1371.62023
arXiv: 1501.00312
OpenAlex: W2963927498
MaRDI QID: Q2012209
Publication date: 28 July 2017
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1501.00312
Keywords: asymptotic normality; nonconvex optimization; robust regression; high-dimensional statistics; support recovery; \(M\)-estimators; statistical consistency
MSC classes: Asymptotic properties of parametric estimators (62F12); Estimation in multivariate analysis (62H12); Robustness and adaptive procedures (parametric inference) (62F35)
Related Items
The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
The main contributions of robust statistics to statistical science and a new challenge
A New Principle for Tuning-Free Huber Regression
Weakly-convex–concave min–max optimization: provable algorithms and applications in machine learning
All-in-one robust estimator of the Gaussian mean
A high-dimensional M-estimator framework for bi-level variable selection
A unified theory of confidence regions and testing for high-dimensional estimating equations
Adaptive Huber regression on Markov-dependent data
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
Manifold Sampling for Optimization of Nonconvex Functions That Are Piecewise Linear Compositions of Smooth Components
The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
Penalised robust estimators for sparse and high-dimensional linear models
Robustness and Tractability for Non-convex M-estimators
High-dimensional robust regression with \(L_q\)-loss functions
Simultaneous feature selection and outlier detection with optimality guarantees
A refined convergence analysis of \(\mathrm{pDCA}_{e}\) with applications to simultaneous sparse recovery and outlier detection
Penalized wavelet nonparametric univariate logistic regression for irregular spaced data
Sparse Laplacian shrinkage for nonparametric transformation survival model
Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression
Rate-optimal robust estimation of high-dimensional vector autoregressive models
M-estimators for models with a mix of discrete and continuous parameters
Models as approximations. I. Consequences illustrated with linear regression
Robust matrix estimations meet Frank-Wolfe algorithm
Additive Bayesian variable selection under censoring and misspecification
Robust inference for high-dimensional single index models
Retire: robust expectile regression in high dimensions
Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis
Differentially private inference via noisy optimization
Robust high-dimensional tuning free multiple testing
High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms
Structure learning via unstructured kernel-based M-estimation
Statistical analysis of sparse approximate factor models
Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates
Renewable Huber estimation method for streaming datasets
Finite-sample analysis of \(M\)-estimators using self-concordance
Oracle Inequalities for Local and Global Empirical Risk Minimizers
A Tuning-free Robust and Efficient Approach to High-dimensional Regression
Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
Regression analysis: likelihood, error and entropy
The landscape of empirical risk for nonconvex losses
I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Asymptotic properties on high-dimensional multivariate regression M-estimation
Identifiability and estimation of meta-elliptical copula generators
Variable smoothing for weakly convex composite functions
Variance prior forms for high-dimensional Bayesian variable selection
Graphical-model based high dimensional generalized linear models
Iteratively reweighted \(\ell_1\)-penalized robust regression
An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems
Adaptive sparse group LASSO in quantile regression
A statistical learning assessment of Huber regression
Wavelet-based robust estimation and variable selection in nonparametric additive models
Computational and statistical analyses for robust non-convex sparse regularized regression problem
Scale calibration for high-dimensional robust regression
Sparse regression for extreme values
The finite sample properties of sparse M-estimators with pseudo-observations
Asymptotic linear expansion of regularized M-estimators
Distributed adaptive Huber regression
High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO
Tractable Bayesian Variable Selection: Beyond Normality
Penalized wavelet estimation and robust denoising for irregular spaced data
High dimensional generalized linear models for temporal dependent data
Asymptotic Properties of Stationary Solutions of Coupled Nonconvex Nonsmooth Empirical Risk Minimization
Batch policy learning in average reward Markov decision processes