The Adaptive Lasso and Its Oracle Properties
DOI: 10.1198/016214506000000735
zbMATH Open: 1171.62326
OpenAlex: W2020925091
Wikidata: Q105584228 (Scholia: Q105584228)
MaRDI QID: Q147375 (FDO: Q147375)
Publication date: 1 December 2006
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1198/016214506000000735
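For readers arriving at this record, the method in the title admits a short illustrative sketch. This is not code from the paper; the helper names (`lasso_cd`, `adaptive_lasso`) are our own, and it uses the standard reparametrization in which penalty weights \(w_j = 1/|\hat\beta_j^{\mathrm{ols}}|^\gamma\) are absorbed by rescaling the design columns, so an ordinary lasso solver can be reused:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Plain lasso (0.5*||y - Xb||^2 + lam*||b||_1) via cyclic coordinate descent."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # soft-thresholding update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Adaptive lasso: penalize coefficient j by w_j = 1/|beta_ols_j|^gamma."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / (np.abs(beta_ols) ** gamma + 1e-8)  # eps guards near-zero OLS coefs
    Xs = X / w            # rescaled columns: X_j / w_j, so plain lasso applies
    beta_s = lasso_cd(Xs, y, lam)
    return beta_s / w     # map back to the original parametrization
```

Because large initial estimates get small penalty weights, strong signals are shrunk less than in the ordinary lasso, which is the mechanism behind the oracle property the paper establishes (under suitable conditions on \(\gamma\) and the tuning parameter).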
Recommendations
- Adaptive Lasso for sparse high-dimensional regression models
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Lasso with convex loss: model selection consistency and estimation
Mathematics Subject Classification:
- Nonparametric estimation (62G05)
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
- Generalized linear models (logistic models) (62J12)
Cited In (first 100 items shown)
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Feature screening via distance correlation learning
- Change-point detection in high-dimensional covariance structure
- Bayesian Lasso binary quantile regression
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- A simple method for estimating interactions between a treatment and a large number of covariates
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Brq: an R package for Bayesian quantile regression
- Inferring sparse Gaussian graphical models with latent structure
- On the conditions used to prove oracle results for the Lasso
- SGL-SVM: a novel method for tumor classification via support vector machine with sparse group lasso
- Variance prior forms for high-dimensional Bayesian variable selection
- Sparse estimation of conditional graphical models with application to gene networks
- Factor selection and structural identification in the interaction ANOVA model
- Shrinkage priors for Bayesian penalized regression
- Feature selection for varying coefficient models with ultrahigh-dimensional covariates
- Sparse additive ordinary differential equations for dynamic gene regulatory network modeling
- ESL-SELO: a robust image denoising algorithm with penalty
- IPF-LASSO: integrative \(L_1\)-penalized regression with penalty factors for prediction based on multi-omics data
- Generalized alternating direction method of multipliers: new theoretical insights and applications
- Variable selection for high dimensional partially linear varying coefficient errors-in-variables models
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- Sign-constrained least squares estimation for high-dimensional regression
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- Stability of feature selection in classification issues for high-dimensional correlated data
- Penalized model-based clustering
- Adaptive Lasso estimators for ultrahigh dimensional generalized linear models
- Robust sparse Gaussian graphical modeling
- Sparse semiparametric discriminant analysis
- Network exploration via the adaptive LASSO and SCAD penalties
- Analyzing large datasets with bootstrap penalization
- A uniform framework for the combination of penalties in generalized structured models
- Adaptive robust variable selection
- Sparse nonparametric model for regression with functional covariate
- Endogeneity in high dimensions
- Sparse regression with multi-type regularized feature modeling
- Simultaneous Factor Selection and Collapsing Levels in ANOVA
- Shrinkage tuning parameter selection with a diverging number of parameters
- Variable Selection for Model-Based High-Dimensional Clustering and Its Application to Microarray Data
- Variable selection in high-dimensional partly linear additive models
- Interaction Model and Model Selection for Function-on-Function Regression
- False Discovery Rate Smoothing
- The group exponential Lasso for bi-level variable selection
- Variable selection in discrete survival models including heterogeneity
- Sure independence screening in generalized linear models with NP-dimensionality
- Conducting sparse feature selection on arbitrarily long phrases in text corpora with a focus on interpretability
- Nearly unbiased variable selection under minimax concave penalty
- Estimator selection in the Gaussian setting
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Exact spike train inference via \(\ell_{0}\) optimization
- Quadratic Majorization for Nonconvex Loss with Applications to the Boosting Algorithm
- RandGA: injecting randomness into parallel genetic algorithm for variable selection
- A unified approach to model selection and sparse recovery using regularized least squares
- Constructing networks by filtering correlation matrices: a null model approach
- Complete subset regressions
- Sparse classification with paired covariates
- Ranked sparsity: a cogent regularization framework for selecting and estimating feature interactions and polynomials
- Stability Selection
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- Identification of biomarker‐by‐treatment interactions in randomized clinical trials with survival outcomes and high‐dimensional spaces
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Correlated variables in regression: clustering and sparse estimation
- Best subset selection via a modern optimization lens
- Multicategory large margin classification with unequal costs
- A sparse conditional Gaussian graphical model for analysis of genetical genomics data
- Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood
- Sparse estimators and the oracle property, or the return of Hodges' estimator
- High-dimensional additive modeling
- Sparse modeling of categorial explanatory variables
- \(\ell_{0}\)-penalized maximum likelihood for sparse directed acyclic graphs
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Variable selection for high-dimensional varying coefficient partially linear models via nonconcave penalty
- Sparse recovery under matrix uncertainty
- Variable selection and regression analysis for graph-structured covariates with an application to genomics
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Sparse estimation in functional linear regression
- Boosting algorithms: regularization, prediction and model fitting
- Oracle inequalities for high dimensional vector autoregressions
- Select the valid and relevant moments: an information-based Lasso for GMM with many moments
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models
- A fast unified algorithm for solving group-lasso penalize learning problems
- \(\ell_{1}\)-penalization for mixture regression models
- Autoregressive process modeling via the Lasso procedure
- Lasso-type recovery of sparse representations for high-dimensional data
- Structured sparsity through convex optimization
- High-dimensional variable selection
- Estimating the dimension of a model
- Selection of fixed effects in high dimensional linear mixed models using a multicycle ECM algorithm
- Penalised inference for lagged dependent regression in the presence of autocorrelated residuals
- High-dimensional multivariate posterior consistency under global-local shrinkage priors
- Gaussian model selection with an unknown variance
- Subset selection for vector autoregressive processes using Lasso
- Sparsistency and rates of convergence in large covariance matrix estimation
- Feature selection and tumor classification for microarray data using relaxed Lasso and generalized multi-class support vector machine
- ArCo: an artificial counterfactual approach for high-dimensional panel time-series data
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Variable and boundary selection for functional data via multiclass logistic regression modeling
- Spatial variable selection and an application to Virginia Lyme disease emergence