scientific article
From MaRDI portal
Publication:3543468
zbMath: 1255.62198 · MaRDI QID: Q3543468
Cun-Hui Zhang, Shuangge Ma, Jian Huang
Publication date: 2 December 2008
Full work available at URL: http://www3.stat.sinica.edu.tw/statistica/J18N4/J18N420/J18N420.html
Title: not shown (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Keywords: asymptotic normality; high-dimensional data; variable selection; penalized regression; oracle property; zero-consistency
Mathematics Subject Classification: Estimation in multivariate analysis (62H12) · Asymptotic distribution theory in statistics (62E20) · Linear regression; mixed models (62J05)
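The keywords above describe penalized regression with data-driven weights aimed at the oracle property. Below is a minimal sketch of the general two-step adaptive (weighted) Lasso idea on simulated data; the use of scikit-learn, the ridge initial estimator, and all variable names and constants are assumptions made for illustration only, not the estimator or tuning rule studied in the paper recorded here.

```python
# Illustrative two-step adaptive (weighted) Lasso on simulated data.
# Assumption: scikit-learn's Ridge/Lasso as off-the-shelf solvers; the
# recorded paper's own procedure and tuning are not reproduced here.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, p = 100, 200                                  # p > n: high-dimensional design
beta_true = np.zeros(p)
beta_true[:5] = [3.0, -2.0, 1.5, 1.0, -1.0]      # sparse true coefficient vector
X = rng.standard_normal((n, p))
y = X @ beta_true + rng.standard_normal(n)

# Step 1: an initial estimator (ridge is one common choice in p > n settings).
beta_init = Ridge(alpha=1.0).fit(X, y).coef_

# Step 2: weighted Lasso with weights w_j = 1 / |beta_init_j|^gamma,
# solved by rescaling the columns of X and mapping the solution back.
gamma, eps = 1.0, 1e-6
w = 1.0 / (np.abs(beta_init) ** gamma + eps)
lasso = Lasso(alpha=0.1, max_iter=50_000).fit(X / w, y)
beta_hat = lasso.coef_ / w

print("selected variables:", np.flatnonzero(beta_hat))
```

In practice the penalty level would be chosen by cross-validation or an information criterion; a fixed value is used here only to keep the sketch short.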
Related Items
Integrating Multisource Block-Wise Missing Data in Model Selection
Modeling association between multivariate correlated outcomes and high-dimensional sparse covariates: the adaptive SVS method
Robust sparse regression by modeling noise as a mixture of gaussians
Stability enhanced variable selection for a semiparametric model with flexible missingness mechanism and its application to the ChAMP study
Graphical group ridge
Sparse Composite Quantile Regression with Ultra-high Dimensional Heterogeneous Data
Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
Sparse Sliced Inverse Regression via Cholesky Matrix Penalization
Semi-Standard Partial Covariance Variable Selection When Irrepresentable Conditions Fail
Automated Estimation of Heavy-Tailed Vector Error Correction Models
Estimation for High-Dimensional Linear Mixed-Effects Models Using ℓ1-Penalization
Test for high dimensional regression coefficients of partially linear models
Independence test in high-dimension using distance correlation and power enhancement technique
Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
Bayesian adaptive Lasso quantile regression
Sparse linear regression models of high dimensional covariates with non-Gaussian outliers and Berkson error-in-variable under heteroscedasticity
Sparsity identification for high-dimensional partially linear model with measurement error
Nonnegative estimation and variable selection via adaptive elastic-net for high-dimensional data
Toward an objective and reproducible model choice via variable selection deviation
Jackknife model averaging for high-dimensional quantile regression
A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
Inferences for extended partially linear single-index models
Inference for the VEC(1) model with a heavy-tailed linear process errors*
A Method for Reducing the Number of Support Vectors in Fuzzy Support Vector Machine
Sparse Laplacian shrinkage for nonparametric transformation survival model
LASSO order selection for sparse autoregression: a bootstrap approach
One-step sparse estimates in the reverse penalty for high-dimensional correlated data
A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression
Neuronized Priors for Bayesian Sparse Linear Regression
UNIFORM ASYMPTOTICS AND CONFIDENCE REGIONS BASED ON THE ADAPTIVE LASSO WITH PARTIALLY CONSISTENT TUNING
Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design
Empirical likelihood based tests for detecting the presence of significant predictors in marginal quantile regression
Locally Sparse Function-on-Function Regression
Adaptive Lasso for generalized linear models with a diverging number of parameters
Model-Free Forward Screening Via Cumulative Divergence
Analyzing large datasets with bootstrap penalization
Parsimonious Model Averaging With a Diverging Number of Parameters
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
Hard thresholding regression
Evaluation of generalized degrees of freedom for sparse estimation by replica method
Discussion of “From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation”
Greedy forward regression for variable screening
The Lasso for High Dimensional Regression with a Possible Change Point
Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data
On the grouped selection and model complexity of the adaptive elastic net
High-dimensional regression with unknown variance
A general theory of concave regularization for high-dimensional sparse estimation problems
Shrinkage and Penalty Estimators of a Poisson Regression Model
An ADMM with continuation algorithm for non-convex SICA-penalized regression in high dimensions
A Tailored Multivariate Mixture Model for Detecting Proteins of Concordant Change Among Virulent Strains of Clostridium Perfringens
Two tales of variable selection for high dimensional regression: Screening and model building
Weighted linear programming discriminant analysis for high-dimensional binary classification
An iterative approach to distance correlation-based sure independence screening
Application of shrinkage estimation in linear regression models with autoregressive errors
Regularized Estimation for the Accelerated Failure Time Model
Multiple Influential Point Detection in High Dimensional Regression Spaces
Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
Sparsity Oriented Importance Learning for High-Dimensional Linear Regression
Consistent two-stage multiple change-point detection in linear models
Stability Selection
Univariate measurement error selection likelihood for variable selection of additive model
Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions
Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
Using random subspace method for prediction and variable importance assessment in linear regression
Variable selection in high-dimensional linear model with possibly asymmetric errors
Selection of fixed effects in high dimensional linear mixed models using a multicycle ECM algorithm
Confidence intervals for high-dimensional partially linear single-index models
Near-ideal model selection by \(\ell _{1}\) minimization
RCV-based error density estimation in the ultrahigh dimensional additive model
An alternating direction method of multipliers for MCP-penalized regression with high-dimensional data
Variable selection in censored quantile regression with high dimensional data
Thresholding least-squares inference in high-dimensional regression models
Random subspace method for high-dimensional regression with the \texttt{R} package \texttt{regRSM}
Testing a single regression coefficient in high dimensional linear models
Least squares approximation with a diverging number of parameters
A rank-corrected procedure for matrix completion with fixed basis coefficients
Penalised inference for lagged dependent regression in the presence of autocorrelated residuals
The growth rate of significant regressors for high dimensional data
Adaptive bridge estimation for high-dimensional regression models
Minimizing variable selection criteria by Markov chain Monte Carlo
Two-layer EM algorithm for ALD mixture regression models: a new solution to composite quantile regression
A relative error-based approach for variable selection
Iteratively reweighted adaptive Lasso for conditional heteroscedastic time series with applications to AR-ARCH type processes
A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
A weight function method for selection of proteins to predict an outcome using protein expression data
Moderately clipped Lasso
The adaptive Lasso in high-dimensional sparse heteroscedastic models
Gene set priorization guided by regulatory networks with p-values through kernel mixed model
Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
Statistical significance in high-dimensional linear models
Penalised robust estimators for sparse and high-dimensional linear models
Sparse semiparametric regression when predictors are mixture of functional and high-dimensional variables
Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
Hypothesis testing for regional quantiles
Modal additive models with data-driven structure identification
Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
Generalized \(F\) test for high dimensional linear regression coefficients
Globally adaptive quantile regression with ultra-high dimensional data
\(\ell_{1}\)-penalization for mixture regression models
Model selection via standard error adjusted adaptive Lasso
Bridge estimators and the adaptive Lasso under heteroscedasticity
Regularized quantile regression for ultrahigh-dimensional data with nonignorable missing responses
Consistent tuning parameter selection in high-dimensional group-penalized regression
Efficient estimation of approximate factor models via penalized maximum likelihood
\(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
Needles and straw in a haystack: posterior concentration for possibly sparse sequences
Consistent group selection in high-dimensional linear regression
Dimension reduction based linear surrogate variable approach for model free variable selection
Generalized F-test for high dimensional regression coefficients of partially linear models
High-dimensional VARs with common factors
Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
Coordinate ascent for penalized semiparametric regression on high-dimensional panel count data
Model-free conditional independence feature screening for ultrahigh dimensional data
Ultra-high dimensional variable screening via Gram-Schmidt orthogonalization
The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
Calibrating nonconvex penalized regression in ultra-high dimension
High-dimensional influence measure
New estimation and inference procedures for a single-index conditional distribution model
Regularization and variable selection for infinite variance autoregressive models
Simultaneous variable selection and smoothing for high-dimensional function-on-scalar regression
Boosting algorithms: regularization, prediction and model fitting
SCAD penalized rank regression with a diverging number of parameters
Concave group methods for variable selection and estimation in high-dimensional varying coefficient models
Adaptive Lasso estimators for ultrahigh dimensional generalized linear models
Shrinkage, pretest, and penalty estimators in generalized linear models
Nonconvex penalized ridge estimations for partially linear additive models in ultrahigh dimension
Discussion: One-step sparse estimates in nonconcave penalized likelihood models
The sparsity and bias of the LASSO selection in high-dimensional linear regression
On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property
Estimation of average treatment effects with panel data: asymptotic theory and implementation
Variable screening for high dimensional time series
High dimensional censored quantile regression
DCA based algorithms for feature selection in multi-class support vector machine
Sparse and efficient estimation for partial spline models with increasing dimension
Improved variable selection with forward-lasso adaptive shrinkage
Sparse classification with paired covariates
Least angle and \(\ell _{1}\) penalized regression: a review
Lasso Inference for High-Dimensional Time Series
Parametric and semiparametric reduced-rank regression with flexible sparsity
Estimator selection in the Gaussian setting
Tests for \(p\)-regression coefficients in linear panel model when \(p\) is divergent
Nearly unbiased variable selection under minimax concave penalty
Variable selection in nonparametric additive models
A flexible shrinkage operator for fussy grouped variable selection
Endogeneity in high dimensions
A note on the one-step estimator for ultrahigh dimensionality
Pursuing Sources of Heterogeneity in Modeling Clustered Population
Variable selection after screening: with or without data splitting?
Test for conditional independence with application to conditional screening
Adaptive group bridge selection in the semiparametric accelerated failure time model
Fixed-effects dynamic spatial panel data models and impulse response analysis
Adaptive function-on-scalar regression with a smoothing elastic net
Variable selection in the accelerated failure time model via the bridge method
Adaptive sparse group LASSO in quantile regression
Error density estimation in high-dimensional sparse linear model
Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models
Path and directionality discovery in individual dynamic models: a regularized unified structural equation modeling approach for hybrid vector autoregression
Optimal linear discriminators for the discrete choice model in growing dimensions
Fourier transform sparse inverse regression estimators for sufficient variable selection
Penalized wavelet estimation and robust denoising for irregular spaced data
Lasso and probabilistic inequalities for multivariate point processes
On the oracle property of adaptive group Lasso in high-dimensional linear models