Asymptotic behavior of M estimators of p regression parameters when \(p^2/n\) is large. II: Normal approximation
DOI: 10.1214/aos/1176349744
zbMath: 0601.62026
OpenAlex: W2022115022
MaRDI QID: Q1081230
Publication date: 1985
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1176349744
Keywords: weak convergence; robustness; consistency; asymptotic normality; regression parameters; M-estimator; general linear model; uniform normal approximation; asymptotic chi square distribution
MSC classifications:
Asymptotic distribution theory in statistics (62E20)
Linear regression; mixed models (62J05)
Robustness and adaptive procedures (parametric inference) (62F35)
Related Items:
COMPLETE SUBSET AVERAGING FOR QUANTILE REGRESSIONS
Communication-efficient distributed estimator for generalized linear models with a diverging number of covariates
On the use of bootstrap with variational inference: theory, interpretation, and a two-sample test example
Asymptotics for high dimensional regression \(M\)-estimates: fixed design results
The asymptotic distribution of the MLE in high-dimensional logistic models: arbitrary covariance
Finite sample inference for quantile regression models
Thresholding least-squares inference in high-dimensional regression models
Adaptive Huber regression on Markov-dependent data
Persistence of plug-in rule in classification of high dimensional multivariate binary data
SCAD-Penalized Least Absolute Deviation Regression in High-Dimensional Models
Tobit regression model with parameters of increasing dimensions
Calibration of the empirical likelihood for high-dimensional data
Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
Augmented factor models with applications to validating market risk factors and forecasting bond risk premia
Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
Empirical process of residuals for high-dimensional linear models
Polynomial spline estimation for generalized varying coefficient partially linear models with a diverging number of components
Group selection via adjusted weighted least absolute deviation regression
Pseudo maximum likelihood estimation of spatial autoregressive models with increasing dimension
Generalized \(F\) test for high dimensional linear regression coefficients
Asymptotics of hierarchical clustering for growing dimension
Doubly robust weighted composite quantile regression based on SCAD‐L2
Simplex-based Multinomial Logistic Regression with Diverging Numbers of Categories and Covariates
Estimation of a multiplicative correlation structure in the large dimensional case
Debiased lasso for generalized linear models with a diverging number of covariates
Asymptotic expansion of the posterior density in high dimensional generalized linear models
Predictive quantile regression with mixed roots and increasing dimensions: the ALQR approach
High dimensional semiparametric moment restriction models
Smoothed quantile regression with large-scale inference
Penalized \(M\)-estimation based on standard error adjusted adaptive elastic-net
Asymptotic properties of GEE with diverging dimension of covariates
Robust inference for high‐dimensional single index models
Sparse Reduced Rank Huber Regression in High Dimensions
STATISTICAL INFERENCE WITH F-STATISTICS WHEN FITTING SIMPLE MODELS TO HIGH-DIMENSIONAL DATA
Asymptotic properties of bridge estimators in sparse high-dimensional regression models
Dimension-agnostic inference using cross U-statistics
High-dimensional Bernstein-von Mises theorem for the Diaconis-Ylvisaker prior
GEE analysis of clustered binary data with diverging number of covariates
Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates
Nonparametric specification testing via the trinity of tests
Random graphs with a given degree sequence
Sparse group variable selection based on quantile hierarchical Lasso
Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares
Relaxing the assumptions of knockoffs by conditioning
Asymptotics and the theory of inference
Asymptotics for one-step m-estimators in regression with application to combining efficiency and high breakdown point
Empirical likelihood test for high dimensional linear models
Modified SCAD penalty for constrained variable selection problems
Statistical Inference for Average Treatment Effects Estimated by Synthetic Control Methods
Critical dimension in profile semiparametric estimation
On the asymptotic normality of Fourier flexible form estimates
On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
A relative error-based estimation with an increasing number of parameters
Adaptive Huber Regression
Residual bootstrap tests in linear models with many regressors
On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
Nonparametric methods in multivariate factorial designs for large number of factor levels
Can we trust the bootstrap in high-dimension?
Inference in regression models with many regressors
Asymptotic properties of maximum quasi-likelihood estimators in generalized linear models with diverging number of covariates
Semi-varying coefficient models with a diverging number of components
The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
Tests for \(p\)-regression coefficients in linear panel model when \(p\) is divergent
Rank procedures for a large number of treatments
A central limit theorem applicable to robust regression estimators
Quantile regression with varying coefficients
Asymptotics for estimation of quantile regressions with truncated infinite-dimensional processes
PREDICTION‐FOCUSED MODEL SELECTION FOR AUTOREGRESSIVE MODELS
Bootstrap inference for instrumental variable models with many weak instruments
Evaluation and selection of models for out-of-sample prediction when the sample size is small relative to the complexity of the data-generating process
Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models
A new perspective on robust \(M\)-estimation: finite sample theory and applications to dependence-adjusted multiple testing
Hypothesis testing in linear regression when \(k/n\) is large
On rank estimators in increasing dimensions
The \(k\)th power expectile regression
GMM inference when the number of moment conditions is large
High-dimensional linear models: a random matrix perspective
Adaptive-to-model checking for regressions with diverging number of predictors
Asymptotics when the number of parameters tends to infinity in the Bradley-Terry model for paired comparisons
On parameters of increasing dimensions
Distributed adaptive Huber regression
Quantile regression in partially linear varying coefficient models
Some contributions to M-estimation in linear models
Asymptotic normality of posterior distributions for exponential families when the number of parameters tends to infinity.
Comment
Weak convergence of the empirical process of residuals in linear models with many parameters
Another look at the jackknife: Further examples of generalized bootstrap
Robust analysis of variance for a randomized block design
A Unifying Tutorial on Approximate Message Passing
Inference on higher-order spatial autoregressive models with increasingly many parameters
Asymptotic normality of robust \(M\)-estimators with convex penalty
Optimal prediction for linear regression with infinitely many parameters.
On the central limit theorem in \(R^p\) when \(p \rightarrow \infty\)
Estimation in quantile regression models for correlated data with diverging number of covariates and large cluster sizes
Jackknife model averaging for quantile regressions