Efficiency for Regularization Parameter Selection in Penalized Likelihood Estimation of Misspecified Models
Publication: 2861816
DOI: 10.1080/01621459.2013.801775
OpenAlex: W3103473643
MaRDI QID: Q2861816
Cheryl J. Flynn, Clifford M. Hurvich, Jeffrey S. Simonoff
Publication date: 11 November 2013
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/1302.2068
Keywords: regularization methods; Akaike information criterion (AIC); least absolute shrinkage and selection operator (Lasso); smoothly clipped absolute deviation (SCAD); model selection/variable selection
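The publication studies the efficiency of AIC-type criteria for choosing the regularization parameter in penalized likelihood estimation (e.g. the Lasso and SCAD) when the model is misspecified. The following is a minimal illustrative sketch, not code from the paper: it assumes a Gaussian linear model and uses scikit-learn's Lasso to select the penalty over a grid by minimizing an AIC-type criterion, with the number of nonzero coefficients used as the degrees of freedom.

```python
# Illustrative sketch only (assumed setup, not the authors' implementation):
# grid search for the Lasso penalty by minimizing a Gaussian AIC-type criterion.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta = np.concatenate([np.array([2.0, -1.5, 1.0]), np.zeros(p - 3)])
y = X @ beta + rng.standard_normal(n)

lambdas = np.logspace(-3, 0, 30)
best = None
for lam in lambdas:
    fit = Lasso(alpha=lam, max_iter=10000).fit(X, y)
    rss = np.sum((y - fit.predict(X)) ** 2)
    df = np.count_nonzero(fit.coef_)       # nonzero count as Lasso degrees of freedom
    aic = n * np.log(rss / n) + 2 * df     # Gaussian AIC up to additive constants
    if best is None or aic < best[0]:
        best = (aic, lam, df)

print(f"selected lambda: {best[1]:.4f} with {best[2]} nonzero coefficients")
```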
Related Items (10)
- Selecting the regularization parameters in high-dimensional panel data models: Consistency and efficiency
- Complete subset averaging approach for high-dimensional generalized linear models
- Frequentist model averaging for envelope models
- Variable selection in linear-circular regression models
- Model averaging estimator in ridge regression and its large sample properties
- On the sensitivity of the Lasso to the number of predictor variables
- A study on tuning parameter selection for the high-dimensional lasso
- A Tailored Multivariate Mixture Model for Detecting Proteins of Concordant Change Among Virulent Strains of Clostridium Perfringens
- Model averaging prediction for nonparametric varying-coefficient models with B-spline smoothing
- Optimal model averaging for divergent-dimensional Poisson regressions
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- RE-EM trees: a data mining approach for longitudinal and clustered data
- The Adaptive Lasso and Its Oracle Properties
- Bayesian statistics then and now
- Asymptotic optimality for \(C_p\), \(C_L\), cross-validation and generalized cross-validation: Discrete index set
- Asymptotically efficient selection of the order of the model for estimating parameters of a linear process
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Estimating the dimension of a model
- Maximum likelihood principle and model selection when the true model is unspecified
- Nonconcave penalized likelihood with a diverging number of parameters.
- On the "degrees of freedom" of the lasso
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Regression and time series model selection in small samples
- An optimal selection of regression variables
- Model Selection for Extended Quasi-Likelihood Models in Small Samples
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Model Selection and Multimodel Inference
- Regularization Parameter Selections via Generalized Information Criterion
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Maximum Likelihood Estimation of Misspecified Models