Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
From MaRDI portal
Publication:1951794
DOI: 10.1214/08-EJS287 · zbMath: 1320.62170 · arXiv: 0808.4051 · OpenAlex: W3105629641 · MaRDI QID: Q1951794
Publication date: 24 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/0808.4051
logistic regression; regression; variable selection; generalized linear models; sparse; high dimensions; lasso; penalty; elastic net; consistent; \(\ell_1\) and \(\ell_1+\ell_2\) regularization
Nonparametric regression and quantile regression (62G08)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Generalized linear models (logistic models) (62J12)
General nonlinear regression (62J02)
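The paper concerns variable selection in logistic regression via \(\ell_1\) (lasso) and \(\ell_1+\ell_2\) (elastic-net) penalization. A minimal sketch of the elastic-net-penalized logistic regression setting, using scikit-learn on synthetic sparse data (this is an illustrative assumption, not the paper's own implementation; the data-generating parameters below are invented for the demo):

```python
# Illustrative sketch: elastic-net (l1 + l2) penalized logistic regression
# for variable selection on synthetic sparse data. Not the paper's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p, s = 200, 50, 5                 # n samples, p features, s truly active
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0                       # sparse true coefficient vector
prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
y = (rng.random(n) < prob).astype(int)

# penalty="elasticnet" mixes an l1_ratio * l1 + (1 - l1_ratio) * l2 penalty;
# in scikit-learn only the "saga" solver supports the elastic-net penalty.
model = LogisticRegression(penalty="elasticnet", solver="saga",
                           l1_ratio=0.5, C=0.5, max_iter=5000)
model.fit(X, y)

# Variable selection: features whose fitted coefficient is nonzero.
selected = np.flatnonzero(model.coef_.ravel() != 0.0)
print("selected features:", sorted(selected))
```

The \(\ell_1\) component drives some coefficients exactly to zero (selection), while the \(\ell_2\) component stabilizes the fit under correlated covariates; the `l1_ratio` parameter trades off the two.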
Related Items
- A general family of trimmed estimators for robust high-dimensional data analysis
- Simultaneous variable selection and de-coarsening in multi-path change-point models
- The information detection for the generalized additive model
- Elastic-net Regularized High-dimensional Negative Binomial Regression: Consistency and Weak Signal Detection
- Adaptive log-density estimation
- Sliding window strategy for convolutional spike sorting with Lasso. Algorithm, theoretical guarantees and complexity
- Nonlinear Variable Selection via Deep Neural Networks
- Weighted Lasso estimates for sparse logistic regression: non-asymptotic properties with measurement errors
- A note on the asymptotic distribution of lasso estimator for correlated data
- Overlapping group lasso for high-dimensional generalized linear models
- On estimation error bounds of the Elastic Net when p ≫ n
- The degrees of freedom of partly smooth regularizers
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Discussion of "Correlated variables in regression: clustering and sparse estimation"
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Lasso regression in sparse linear model with \(\varphi\)-mixing errors
- A consistent algorithm to solve Lasso, elastic-net and Tikhonov regularization
- Variable selection for sparse logistic regression
- Estimating networks with jumps
- Lasso, iterative feature selection and the correlation selector: oracle inequalities and numerical performances
- Dimension reduction and variable selection in case control studies via regularized likelihood optimization
- Self-concordant analysis for logistic regression
- The smooth-Lasso and other \(\ell_1+\ell_2\)-penalized methods
- SPADES and mixture models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Graphical-model based high dimensional generalized linear models
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Concentration Inequalities for Statistical Inference
- New estimation and feature selection methods in mixture-of-experts models
- Tuning parameter calibration for \(\ell_1\)-regularized logistic regression
- On model selection consistency of regularized M-estimators
- Innovated interaction screening for high-dimensional nonlinear classification
- The first-order necessary conditions for sparsity constrained optimization
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Near-ideal model selection by \(\ell _{1}\) minimization
- High-dimensional variable selection
- Sparsity in penalized empirical risk minimization
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Lasso type classifiers with a reject option
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Aggregation for Gaussian regression
- High-dimensional graphs and variable selection with the Lasso
- How to compare different loss functions and their risks
- Stable recovery of sparse overcomplete representations in the presence of noise
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A new approach to variable selection in least squares problems
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Regularization and Variable Selection Via the Elastic Net
- Combinatorial methods in density estimation