An overview of reciprocal L₁-regularization for high dimensional regression data
From MaRDI portal
Publication: 6602178
DOI: 10.1002/WICS.1416 · zbMATH Open: 1544.62137 · MaRDI QID: Q6602178 · FDO: Q6602178
Authors: Qifan Song
Publication date: 11 September 2024
Published in: Wiley Interdisciplinary Reviews: Computational Statistics (WIREs Computational Statistics)
Cites Work
- Sure independence screening in generalized linear models with NP-dimensionality
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Estimating the dimension of a model
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Extended Bayesian information criteria for model selection with large model spaces
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- A new look at the statistical model identification
- One-step sparse estimates in nonconcave penalized likelihood models
- Lasso-type recovery of sparse representations for high-dimensional data
- Title not available
- The Bayesian Lasso
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Stochastic Approximation in Monte Carlo Computation
- Extended BIC for small-\(n\)-large-\(P\) sparse GLM
- Linear Model Selection by Cross-Validation
- Shotgun Stochastic Search for “Large p” Regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Bayesian variable selection with shrinking and diffusing priors
- Local operator theory, random matrices and Banach spaces
- Likelihood-based selection and sharp parameter estimation
- Bayesian model selection in high-dimensional settings
- Bayesian Subset Modeling for High-Dimensional Generalized Linear Models
- Adaptive estimation of a quadratic functional by model selection
- Cross-validation for selecting a model selection procedure
- On the computational complexity of high-dimensional Bayesian variable selection
- A split-and-merge Bayesian variable selection approach for ultrahigh dimensional regression
- High-dimensional variable selection with reciprocal \(L_{1}\)-regularization
- Some connections between Bayesian and non-Bayesian methods for regression model selection
- Simulated annealing process in general state space
- Simulated stochastic approximation annealing for global optimization with a square-root cooling schedule
Cited In (1)