AIC for the Lasso in generalized linear models
DOI: 10.1214/16-EJS1179 · zbMath: 1347.62145 · OpenAlex: W2510343895 · MaRDI QID: Q315399
Shuichi Kawano, Yoshiyuki Ninomiya
Publication date: 21 September 2016
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1473431413
Keywords: Kullback-Leibler divergence; variable selection; information criterion; tuning parameter; convexity lemma; statistical asymptotic theory
MSC classifications:
- Asymptotic properties of parametric estimators (62F12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Asymptotic distribution theory in statistics (62E20)
- Generalized linear models (logistic models) (62J12)
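The keywords point to tuning-parameter selection for the lasso via an information criterion in generalized linear models. As a rough, hypothetical illustration only (not the criterion derived in the paper, which rests on statistical asymptotic theory and the Kullback-Leibler divergence), the Python sketch below selects the penalty level for an L1-penalized logistic regression by minimizing a naive AIC-style score, -2 log-likelihood + 2 x (active-set size); the simulated data, the penalty grid, and the use of scikit-learn's LogisticRegression are all assumptions made for this example.

# Hypothetical sketch: AIC-style tuning-parameter selection for an
# L1-penalized logistic regression (a GLM), not the paper's derivation.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [1.5, -2.0, 1.0]                  # sparse true coefficients
prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
y = rng.binomial(1, prob)

def aic_for_penalty(C):
    """Fit an L1-penalized logistic regression and return an AIC-style score.

    Degrees of freedom are approximated by the number of nonzero coefficients
    plus the intercept, a common heuristic for the lasso; the paper's
    criterion refines the bias correction for penalized GLM estimators.
    """
    model = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    model.fit(X, y)
    p_hat = model.predict_proba(X)[:, 1]
    eps = 1e-12                               # guard against log(0)
    loglik = np.sum(y * np.log(p_hat + eps) + (1 - y) * np.log(1 - p_hat + eps))
    df = np.count_nonzero(model.coef_) + 1    # active set + intercept
    return -2.0 * loglik + 2.0 * df

grid = np.logspace(-2, 2, 25)                 # C = 1 / lambda in scikit-learn
scores = [aic_for_penalty(C) for C in grid]
best_C = grid[int(np.argmin(scores))]
print(f"selected C (inverse penalty strength): {best_C:.3f}")

Counting nonzero coefficients as degrees of freedom is only a heuristic, in the spirit of the cited work on the degrees of freedom of the lasso; the paper instead derives an AIC for the lasso in GLMs as an information criterion grounded in asymptotic theory.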
Cites Work
- Sparse inverse covariance estimation with the graphical lasso
- The Adaptive Lasso and Its Oracle Properties
- Exact post-selection inference, with application to the Lasso
- Cox's regression model for counting processes: A large sample study
- The solution path of the generalized lasso
- Lasso-type recovery of sparse representations for high-dimensional data
- Estimation of the mean of a multivariate normal distribution
- Asymptotics for generalized estimating equations with large cluster sizes
- Asymptotics for Lasso-type estimators.
- Least angle regression. (With discussion)
- Rejoinder: "A significance test for the lasso"
- Optimal two-step prediction in regression
- High-dimensional generalized linear models and the lasso
- Aggregation for Gaussian regression
- Information criteria and statistical modeling.
- On the "degrees of freedom" of the lasso
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Scaled sparse linear regression
- Model selection and estimation in the Gaussian graphical model
- Regression and time series model selection in small samples
- Cross-validation and multinomial prediction
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Generalised information criteria in model selection
- The Focused Information Criterion
- Bayesian Measures of Model Complexity and Fit
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Regularization Parameter Selections via Generalized Information Criterion
- Convex Analysis
- On Information and Sufficiency