On the adaptive elastic net with a diverging number of parameters
From MaRDI portal
Publication:2388979
Abstract: We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348--1360] and [Ann. Statist. 32 (2004) 928--961] which ensures the optimal large sample performance. Furthermore, the high-dimensionality often induces the collinearity problem, which should be properly handled by the ideal method. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net that combines the strengths of the quadratic regularization and the adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with the collinearity problem better than the other oracle-like methods, thus enjoying much improved finite sample performance.
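The abstract describes a two-stage procedure: an ordinary elastic-net fit supplies adaptive weights, and a second, weighted elastic-net fit yields the final estimator. The following is a minimal NumPy sketch of that idea using plain coordinate descent; the tuning values `lam1`, `lam2`, `gamma`, the weight floor `1e-3`, and the simulated data are illustrative assumptions, not values from the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in lasso-type coordinate updates."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_enet_cd(X, y, lam1, lam2, w, n_iter=500):
    """Coordinate descent for the weighted elastic-net criterion
    (1/2n)||y - Xb||^2 + lam1 * sum_j w_j|b_j| + (lam2/2)||b||^2."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()                       # running residual y - Xb
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # remove j-th coordinate's contribution
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam1 * w[j]) / (col_sq[j] + lam2)
            r -= X[:, j] * b[j]
    return b

# Simulated sparse regression problem (assumed setup for illustration).
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

lam1, lam2, gamma = 0.1, 0.1, 1.0                   # hypothetical tuning values
b_enet = weighted_enet_cd(X, y, lam1, lam2, np.ones(p))   # stage 1: plain elastic net
w = (np.abs(b_enet) + 1e-3) ** (-gamma)                   # adaptive weights
b_ada = weighted_enet_cd(X, y, lam1, lam2, w)             # stage 2: weighted fit
beta_hat = (1 + lam2 / n) * b_ada                   # rescaling in the spirit of the paper
```

Large first-stage coefficients receive small weights and are barely penalized in stage 2, while coefficients near zero receive very large weights and are thresholded to exactly zero, which is the mechanism behind the oracle property discussed above.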
Recommendations
- On the oracle property of a generalized adaptive elastic-net for multivariate linear regression with a diverging number of parameters
- A group adaptive elastic-net approach for variable selection in high-dimensional linear regression
- The adaptive gril estimator with a diverging number of parameters
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Regularization and Variable Selection Via the Elastic Net
Cites work
- scientific article; zbMATH DE number 739533
- scientific article; zbMATH DE number 845714
- Asymptotic behavior of M-estimators of p regression parameters when \(p^2/n\) is large. I. Consistency
- Asymptotics for Lasso-type estimators
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Heuristics of instability and stabilization in model selection
- Ideal spatial adaptation by wavelet shrinkage
- Least angle regression. (With discussion)
- Nonconcave penalized likelihood with a diverging number of parameters
- On the ``degrees of freedom'' of the lasso
- Profile-kernel likelihood inference with diverging number of parameters
- Regularization and Variable Selection Via the Elastic Net
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Semilinear High-Dimensional Model for Normalization of Microarray Data
- Statistical challenges with high dimensionality: feature selection in knowledge discovery
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in
- Oracle estimation of parametric transformation models
- Two-step adaptive model selection for vector autoregressive processes
- Synthesizing external aggregated information in the presence of population heterogeneity: A penalized empirical likelihood approach
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- Efficient penalized estimation for linear regression model
- One-step sparse estimates in the reverse penalty for high-dimensional correlated data
- Combined-penalized likelihood estimations with a diverging number of parameters
- Penalized \(M\)-estimation based on standard error adjusted adaptive elastic-net
- Data mining for longitudinal data under multicollinearity and time dependence using penalized generalized estimating equations
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- Nonnegative estimation and variable selection via adaptive elastic-net for high-dimensional data
- Feature screening via distance correlation learning
- Consistent tuning parameter selection in high-dimensional group-penalized regression
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- Regression adjustment for treatment effect with multicollinearity in high dimensions
- Robust variable selection for generalized linear models with a diverging number of parameters
- On the oracle property of adaptive group Lasso in high-dimensional linear models
- A sparse additive model for high-dimensional interactions with an exposure variable
- Variable selection in high-dimensional linear model with possibly asymmetric errors
- Generalized co-sparse factor regression
- Regularization-based bootstrap ranking model: identifying healthcare indicators among all level income economies
- A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery
- Penalized profile least squares-based statistical inference for varying coefficient partially linear errors-in-variables models
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Smooth LASSO estimator for the function-on-function linear regression model
- Doubly robust weighted composite quantile regression based on SCAD‐L2
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Manifold elastic net: a unified framework for sparse dimension reduction
- Modified SCAD penalty for constrained variable selection problems
- Nonconvex penalized ridge estimations for partially linear additive models in ultrahigh dimension
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- Negative binomial factor regression with application to microbiome data analysis
- A generalized bridge regression in fuzzy environment and its numerical solution by a capable recurrent neural network
- Predictive stability criteria for penalty selection in linear models
- Penalized empirical likelihood for high-dimensional partially linear varying coefficient model with measurement errors
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- The use of random-effect models for high-dimensional variable selection problems
- A new data adaptive elastic net predictive model using hybridized smoothed covariance estimators with information complexity
- On Hodges' superefficiency and merits of oracle property in model selection
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Sparse reduced-rank regression for simultaneous dimension reduction and variable selection
- Model selection via standard error adjusted adaptive Lasso
- Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates
- Robust variable selection and estimation via adaptive elastic net S-estimators for linear regression
- Informative gene selection for microarray classification via adaptive elastic net with conditional mutual information
- Variable selection through adaptive elastic net for proportional odds model
- Balanced estimation for high-dimensional measurement error models
- Penalized variable selection for accelerated failure time models with random effects
- Robust variable selection in semiparametric mean-covariance regression for longitudinal data analysis
- Fast iterative regularization by reusing data
- Model selection consistency of Lasso for empirical data
- A majorization-minimization approach to variable selection using spike and slab priors
- The finite sample properties of sparse M-estimators with pseudo-observations
- Convex and non-convex regularization methods for spatial point processes intensity estimation
- Group variable selection via SCAD-L2
- Variable selection and collinearity processing for multivariate data via row-elastic-net regularization
- Combining phenotypic and genomic data to improve prediction of binary traits
- Overview of robust variable selection methods for high-dimensional linear regression model
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- Weighted elastic net penalized mean-variance portfolio design and computation
- The adaptive gril estimator with a diverging number of parameters
- Endogeneity in high dimensions
- The Penalized Analytic Center Estimator
- Improving accuracy models using elastic net regression approach based on empirical mode decomposition
- The information detection for the generalized additive model
- Model selection and structure specification in ultra-high dimensional generalised semi-varying coefficient models
- Fast and accurate variational inference for large Bayesian VARs with stochastic volatility
- On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
- Penalized logistic regression with prior information for microarray gene expression classification
- One-step sparse ridge estimation with folded concave penalty
- Robust variable selection for finite mixture regression models
- An asymptotic theory for least squares model averaging with nested models
- Penalized regression models with autoregressive error terms
- Scalable algorithms for semiparametric accelerated failure time models in high dimensions
- Variable selection for survival data with a class of adaptive elastic net techniques
- Asymptotic properties of GEE estimator for clustered ordinal data with high-dimensional covariates
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- Stable prediction in high-dimensional linear models
- Model-free feature screening via a modified composite quantile correlation
- Shrinkage tuning parameter selection with a diverging number of parameters
- Cluster feature selection in high-dimensional linear models
- The reciprocal Bayesian bridge for left-censored data
- Subset selection for vector autoregressive processes via adaptive Lasso
- Adaptive elastic-net selection in a quantile model with diverging number of variable groups
- Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models
- Variable selection in linear mixed effects models
- \(\ell_0\)-regularized high-dimensional accelerated failure time model
- Some recent statistical learning methods for longitudinal high-dimensional data
- Variable selection using \(L_q\) penalties
- Variable selection using P-splines
- Distributed subsampling for multiplicative regression
- Variable selection in linear mixed models using an extended class of penalties
- An improved variable selection procedure for adaptive Lasso in high-dimensional survival analysis
- Synthesizing external aggregated information in the penalized Cox regression under population heterogeneity
- Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters
- High-dimensional index tracking based on the adaptive elastic net
- Grouping Variable Selection by Weight Fused Elastic Net for Multi-Collinear Data
- Variable selection for varying-coefficient models with the sparse regularization
- Adaptive fused LASSO in grouped quantile regression
- Reader reaction to “Outcome‐adaptive lasso: Variable selection for causal inference” by Shortreed and Ertefaie (2017)
This page was built for publication: On the adaptive elastic net with a diverging number of parameters