On the adaptive elastic net with a diverging number of parameters
Publication: 2388979
DOI: 10.1214/08-AOS625 · zbMATH Open: 1168.62064 · arXiv: 0908.1836 · Wikidata: Q40349223 · Scholia: Q40349223 · MaRDI QID: Q2388979 · FDO: Q2388979
Authors: Hui Zou, Hao Helen Zhang
Publication date: 22 July 2009
Published in: The Annals of Statistics
Abstract: We consider the problem of model selection and estimation in situations where the number of parameters diverges with the sample size. When the dimension is high, an ideal method should have the oracle property [J. Amer. Statist. Assoc. 96 (2001) 1348--1360] and [Ann. Statist. 32 (2004) 928--961] which ensures the optimal large sample performance. Furthermore, the high-dimensionality often induces the collinearity problem, which should be properly handled by the ideal method. Many existing variable selection methods fail to achieve both goals simultaneously. In this paper, we propose the adaptive elastic-net that combines the strengths of the quadratic regularization and the adaptively weighted lasso shrinkage. Under weak regularity conditions, we establish the oracle property of the adaptive elastic-net. We show by simulations that the adaptive elastic-net deals with the collinearity problem better than the other oracle-like methods, thus enjoying much improved finite sample performance.
Full work available at URL: https://arxiv.org/abs/0908.1836
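The estimator the abstract describes admits a compact two-stage sketch: fit an ordinary elastic net, build adaptive \(\ell_1\) weights \(w_j = |\hat{\beta}_j^{\mathrm{init}}|^{-\gamma}\) from the initial coefficients, then refit with the weighted \(\ell_1\) penalty. The snippet below is a minimal numpy illustration in that spirit, not the paper's exact estimator: the \((1+\lambda_2)\) rescaling used by Zou and Zhang and any data-driven tuning-parameter selection are omitted, and the function names, the small `eps` offset in the weights, and the \(\lambda\) values are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_cd(X, y, lam1, lam2, weights=None, n_sweeps=500):
    """Coordinate descent for a (weighted) elastic-net objective:
    (1/2n)||y - Xb||^2 + lam1 * sum_j w_j |b_j| + (lam2/2)||b||^2."""
    n, p = X.shape
    w = np.ones(p) if weights is None else weights
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            # Partial residual with feature j's contribution added back in.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam1 * w[j]) / (col_sq[j] + lam2)
    return b

def adaptive_elastic_net(X, y, lam1, lam2, gamma=1.0, eps=1e-3):
    """Two-stage fit: plain elastic net, then refit with l1 weights
    (|b_j| + eps)^(-gamma); eps guards against division by zero."""
    b_init = elastic_net_cd(X, y, lam1, lam2)
    w = (np.abs(b_init) + eps) ** (-gamma)
    return elastic_net_cd(X, y, lam1, lam2, weights=w)

# Toy demonstration: 3 active predictors out of 10.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.array([3.0, 2.0, 1.5] + [0.0] * 7)
y = X @ beta_true + 0.1 * rng.standard_normal(200)
b_hat = adaptive_elastic_net(X, y, lam1=0.1, lam2=0.01)
```

In the toy run the adaptive weights make the \(\ell_1\) penalty heavy on the seven inactive coordinates (which are therefore thresholded exactly to zero) and light on the three active ones, which is the mechanism behind the oracle property discussed in the abstract.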
Recommendations
- On the oracle property of a generalized adaptive elastic-net for multivariate linear regression with a diverging number of parameters
- A group adaptive elastic-net approach for variable selection in high-dimensional linear regression
- The adaptive gril estimator with a diverging number of parameters
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Regularization and Variable Selection Via the Elastic Net
Keywords: model selection; high dimensionality; oracle property; shrinkage methods; elastic-net; adaptive regularization
Cites Work
- Heuristics of instability and stabilization in model selection
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Robust regression: Asymptotics, conjectures and Monte Carlo
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Regularization and Variable Selection Via the Elastic Net
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Asymptotics for Lasso-type estimators.
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Nonconcave penalized likelihood with a diverging number of parameters.
- Statistical challenges with high dimensionality: feature selection in knowledge discovery
- Asymptotic behavior of M-estimators of p regression parameters when \(p^2/n\) is large. I. Consistency
- On the "degrees of freedom" of the lasso
- Semilinear High-Dimensional Model for Normalization of Microarray Data
- Profile-kernel likelihood inference with diverging number of parameters
Cited In (first 100 items shown)
- High-dimensional linear regression with hard thresholding regularization: theory and algorithm
- Regression adjustment for treatment effect with multicollinearity in high dimensions
- Regularization-based bootstrap ranking model: identifying healthcare indicators among all level income economies
- A consistent and numerically efficient variable selection method for sparse Poisson regression with applications to learning and signal recovery
- Penalized profile least squares-based statistical inference for varying coefficient partially linear errors-in-variables models
- Modified SCAD penalty for constrained variable selection problems
- Nonconvex penalized ridge estimations for partially linear additive models in ultrahigh dimension
- A new data adaptive elastic net predictive model using hybridized smoothed covariance estimators with information complexity
- A generalized bridge regression in fuzzy environment and its numerical solution by a capable recurrent neural network
- On Hodges' superefficiency and merits of oracle property in model selection
- Model selection via standard error adjusted adaptive Lasso
- Informative gene selection for microarray classification via adaptive elastic net with conditional mutual information
- Convex and non-convex regularization methods for spatial point processes intensity estimation
- Variable selection and collinearity processing for multivariate data via row-elastic-net regularization
- The adaptive gril estimator with a diverging number of parameters
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
- Robust variable selection for finite mixture regression models
- Stable prediction in high-dimensional linear models
- Cluster feature selection in high-dimensional linear models
- Model-free feature screening via a modified composite quantile correlation
- Grouping Variable Selection by Weight Fused Elastic Net for Multi-Collinear Data
- Adaptive fused LASSO in grouped quantile regression
- Hierarchically penalized additive hazards model with diverging number of parameters
- Robust group identification and variable selection in regression
- Variable selection and parameter estimation with the Atan regularization method
- Bi-level variable selection via adaptive sparse group Lasso
- Testing for Neglected Nonlinearity Using Regularized Artificial Neural Networks
- Penalized time-varying model averaging
- Multi-step adaptive elastic-net: reducing false positives in high-dimensional variable selection
- Least-Square Approximation for a Distributed System
- Accelerating the distance-minimizing method for data-driven elasticity with adaptive hyperparameters
- Group identification and variable selection in quantile regression
- Sparsity constrained estimation in image processing and computer vision
- On the oracle property of a generalized adaptive elastic-net for multivariate linear regression with a diverging number of parameters
- Model averaging with covariates that are missing completely at random
- The adaptive BerHu penalty in robust regression
- Variable selection with spatially autoregressive errors: a generalized moments Lasso estimator
- Variable selection and estimation using a continuous approximation to the \(L_0\) penalty
- Generalized F-test for high dimensional regression coefficients of partially linear models
- Regression with adaptive Lasso and correlation based penalty
- Robust elastic net estimators for variable selection and identification of proteomic biomarkers
- A group adaptive elastic-net approach for variable selection in high-dimensional linear regression
- Asymptotic theory of the adaptive sparse group Lasso
- Truncated \(L_1\) regularized linear regression: theory and algorithm
- Multi-task sparse identification for closed-loop systems with general observation sequences
- Adaptive-to-model checking for regressions with diverging number of predictors
- SCAD-penalized least absolute deviation regression in high-dimensional models
- Generalization error bounds of dynamic treatment regimes in penalized regression-based learning
- Efficient penalized estimation for linear regression model
- Two-step adaptive model selection for vector autoregressive processes
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- Feature screening via distance correlation learning
- Data mining for longitudinal data under multicollinearity and time dependence using penalized generalized estimating equations
- Consistent tuning parameter selection in high-dimensional group-penalized regression
- Robust variable selection for generalized linear models with a diverging number of parameters
- A sparse additive model for high-dimensional interactions with an exposure variable
- On the oracle property of adaptive group Lasso in high-dimensional linear models
- Doubly robust weighted composite quantile regression based on SCAD‐L2
- Variable selection in high-dimensional linear model with possibly asymmetric errors
- Generalized co-sparse factor regression
- Smooth LASSO estimator for the function-on-function linear regression model
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- Manifold elastic net: a unified framework for sparse dimension reduction
- The use of random-effect models for high-dimensional variable selection problems
- Sparse reduced-rank regression for simultaneous dimension reduction and variable selection
- Penalized empirical likelihood for high-dimensional partially linear varying coefficient model with measurement errors
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- Profiled adaptive elastic-net procedure for partially linear models with high-dimensional covariates
- Balanced estimation for high-dimensional measurement error models
- Group variable selection via SCAD-L2
- Robust variable selection in semiparametric mean-covariance regression for longitudinal data analysis
- Model selection consistency of Lasso for empirical data
- The finite sample properties of sparse M-estimators with pseudo-observations
- A majorization-minimization approach to variable selection using spike and slab priors
- Weighted elastic net penalized mean-variance portfolio design and computation
- Endogeneity in high dimensions
- Fast and accurate variational inference for large Bayesian VARs with stochastic volatility
- Model selection and structure specification in ultra-high dimensional generalised semi-varying coefficient models
- Penalized regression models with autoregressive error terms
- Variable selection for survival data with a class of adaptive elastic net techniques
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- Shrinkage tuning parameter selection with a diverging number of parameters
- Subset selection for vector autoregressive processes via adaptive Lasso
- Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models
- Variable selection in linear mixed effects models
- \(\ell_0\)-regularized high-dimensional accelerated failure time model
- Variable selection in linear mixed models using an extended class of penalties
- An improved variable selection procedure for adaptive Lasso in high-dimensional survival analysis
- Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters
- On the grouped selection and model complexity of the adaptive elastic net
- Variable selection for varying-coefficient models with the sparse regularization
- Exponentially tilted likelihood inference on growing dimensional unconditional moment models
- Some properties of generalized fused Lasso and its applications to high dimensional data
- Adaptive group Lasso selection in quantile models
- Variable Selection with Multiply-Imputed Datasets: Choosing Between Stacked and Grouped Methods
- On model selection consistency of the elastic net when \(p \gg n\)
- Two tales of variable selection for high dimensional regression: Screening and model building