Selection of tuning parameters in bridge regression models via Bayesian information criterion
From MaRDI portal
Publication: 465645
DOI: 10.1007/S00362-013-0561-7
zbMATH Open: 1297.62152
arXiv: 1203.4326
OpenAlex: W3105452317
MaRDI QID: Q465645
FDO: Q465645
Publication date: 24 October 2014
Published in: Statistical Papers
Abstract: We consider bridge linear regression modeling, which can produce either a sparse or a non-sparse model. A crucial point in the model-building process is the selection of the adjustment parameters, namely the regularization parameter and the tuning parameter of the bridge penalty. The choice of these adjustment parameters can be viewed as a model selection and evaluation problem. We propose a model selection criterion for evaluating bridge regression models from a Bayesian approach. This criterion enables us to select the adjustment parameters objectively. We investigate the effectiveness of the proposed modeling strategy through numerical examples.
Full work available at URL: https://arxiv.org/abs/1203.4326
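As context for the abstract: in the standard bridge formulation the two adjustment parameters are the regularization parameter \(\lambda\) and the penalty exponent \(q\) in the objective \(\tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda \sum_{j=1}^{p} |\beta_j|^{q}\) (with \(q = 1\) the lasso and \(q = 2\) ridge regression). The sketch below is illustrative only and is not the criterion derived in the paper: it computes bridge estimates by a local quadratic approximation (iteratively reweighted ridge) and scores each \((\lambda, q)\) pair on a small grid with a generic BIC that uses a trace-based effective degrees of freedom; the solver, the degrees-of-freedom formula, and the grid values are all assumptions made for this example.

```python
# Minimal sketch of the setting in the abstract, NOT the authors' criterion.
# Assumptions (illustrative only): the bridge objective
#   (1/2) * ||y - X beta||^2 + lam * sum_j |beta_j|^q,
# a local quadratic approximation (iteratively reweighted ridge) solver,
# and a generic BIC with a trace-based effective degrees of freedom.
import numpy as np


def bridge_fit(X, y, lam, q, n_iter=100, eps=1e-8, tol=1e-6):
    """Approximate bridge estimate for fixed (lam, q) via reweighted ridge."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS starting value
    for _ in range(n_iter):
        # quadratic weights from linearizing lam * |beta_j|^q at the current iterate
        w = lam * q * np.maximum(np.abs(beta), eps) ** (q - 2)
        beta_new = np.linalg.solve(X.T @ X + np.diag(w), X.T @ y)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta


def bic_score(X, y, beta, lam, q, eps=1e-8):
    """Generic BIC-type score: n*log(RSS/n) + log(n)*df (illustrative df)."""
    n = X.shape[0]
    rss = np.sum((y - X @ beta) ** 2)
    w = lam * q * np.maximum(np.abs(beta), eps) ** (q - 2)
    df = np.trace(X @ np.linalg.solve(X.T @ X + np.diag(w), X.T))
    return n * np.log(rss / n) + np.log(n) * df


# toy data with a sparse true coefficient vector
rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# grid search over the two adjustment parameters, scored by the BIC-type criterion
grid = [(lam, q) for lam in (0.1, 1.0, 10.0) for q in (0.5, 1.0, 1.5, 2.0)]
best = min(grid, key=lambda t: bic_score(X, y, bridge_fit(X, y, *t), *t))
print("selected (lambda, q):", best)
```

Running the sketch prints the grid point with the smallest score on the toy data. The point of the paper is to replace the generic score above with a criterion derived from a Bayesian (marginal-likelihood) argument, so that \(\lambda\) and \(q\) are selected objectively.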
Cites Work
- Title not available
- Title not available
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- Estimating the dimension of a model
- Flexible smoothing with \(B\)-splines and penalties. With comments and a rejoinder by the authors
- Regression and time series model selection in small samples
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications.
- A new look at the statistical model identification
- One-step sparse estimates in nonconcave penalized likelihood models
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Asymptotics for Lasso-type estimators.
- Accurate Approximations for Posterior Moments and Marginal Densities
- A group bridge approach for variable selection
- A Statistical View of Some Chemometrics Regression Tools
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Penalized likelihood regression for generalized linear models with non-quadratic penalties
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Bayesian information criteria and smoothing parameter selection in radial basis function networks
- Penalized estimation in additive varying coefficient models using grouped regularization
- Support vector machines with adaptive \(L_q\) penalty
- Bootstrapping log likelihood and EIC, an extension of AIC
- Bridge regression: adaptivity and group selection
- The Bayesian Bridge
- Erratum to: "Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator"
Cited In (5)
- Deterministic bridge regression for compressive classification
- Bayesian information criterion approximations to Bayes factors for univariate and multivariate logistic regression models
- Degrees of freedom for regularized regression with Huber loss and linear constraints
- The minimum \(S\)-divergence estimator under continuous models: the Basu-Lindsay approach
- Penalized MM regression estimation with \(L_\gamma\) penalty: a robust version of bridge regression
Recommendations
- Title not available
- Title not available
- Title not available
- Title not available
- Criteria for Bayesian model choice with application to variable selection
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Selection of tuning parameters, solution paths and standard errors for Bayesian Lassos
- Performance of Variable Selection Methods in Regression Using Variations of the Bayesian Information Criterion
- Bayesian bridge regression
- On model selection in Bayesian regression