L0-Regularized Learning for High-Dimensional Additive Hazards Regression
Publication: 5058017
DOI: 10.1287/IJOC.2022.1208
OpenAlex: W4282842776
MaRDI QID: Q5058017
FDO: Q5058017
Authors:
Publication date: 1 December 2022
Published in: INFORMS Journal on Computing
Full work available at URL: https://doi.org/10.1287/ijoc.2022.1208
Recommendations
- \(\ell_0\)-regularized high-dimensional accelerated failure time model
- Bi-selection in the high-dimensional additive hazards regression model
- Variable selection and structure estimation for ultrahigh-dimensional additive hazards models
- Reproducible feature selection in high-dimensional accelerated failure time models
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
Keywords: model selection consistency, survival data analysis, high-dimensional features, \(L_0\)-regularized learning, global and local optimizers, primal dual active sets
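As a point of reference for these keywords, here is a minimal sketch of the kind of objective involved, assuming the Lin–Ying additive hazards formulation with a plain \(L_0\) penalty; the symbols \(\beta\), \(\lambda_0\), \(L_n\), and \(\gamma\) are illustrative notation, not taken from the paper:
\[
\lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z, \qquad
\hat{\beta} \in \arg\min_{\beta}\; L_n(\beta) + \gamma \|\beta\|_0, \qquad
\|\beta\|_0 = \#\{j : \beta_j \neq 0\},
\]
where \(L_n\) is a least-squares-type loss derived from the Lin–Ying estimating equations and \(\gamma > 0\) trades off goodness of fit against the number of selected covariates.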
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Random survival forests
- A unified approach to model selection and sparse recovery using regularized least squares
- Least angle regression. (With discussion)
- Title not available
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- One-step sparse estimates in nonconcave penalized likelihood models
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Best subset selection via a modern optimization lens
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Title not available
- High-Dimensional Variable Selection for Survival Data
- Regularization and Variable Selection Via the Elastic Net
- Title not available
- Penalized high-dimensional empirical likelihood
- Scaled sparse linear regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Program evaluation and causal inference with high-dimensional data
- Cox's regression model for counting processes: A large sample study
- Risk bounds for model selection via penalization
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
- Extended Bayesian information criterion in the Cox model with a high-dimensional feature space
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Semiparametric analysis of the additive risk model
- Covariate selection for the semiparametric additive risk model
- On Cox processes and credit risky securities
- A partly parametric additive risk model
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- Oracle inequalities and selection consistency for weighted Lasso in high-dimensional additive hazards model
- High-Dimensional Sparse Additive Hazards Regression
- High Dimensional Thresholded Regression and Shrinkage Effect
- Variable inclusion and shrinkage algorithms
- Title not available
- Faster Kriging: facing high-dimensional simulators
- Statistical optimization in high dimensions
- High-dimensional macroeconomic forecasting and variable selection via penalized regression
- Convex optimization for group feature selection in networked data
- A polynomial algorithm for best-subset selection problem
- Nonsparse learning with latent variables
- Understanding the impact of individual users' rating characteristics on the predictive accuracy of recommender systems
- Variable screening for survival data in the presence of heterogeneous censoring
Cited In (5)
- Best subset selection with shrinkage: sparse additive hazards regression with the grouping effect
- High-dimensional additive hazards models and the lasso
- CoxKnockoff: controlled feature selection for the Cox model using knockoffs
- High-Dimensional Sparse Additive Hazards Regression
- Cutting-plane algorithm for estimation of sparse Cox proportional hazards models