L0-Regularized Learning for High-Dimensional Additive Hazards Regression
Publication: Q5058017
DOI: 10.1287/ijoc.2022.1208
OpenAlex ID: W4282842776
MaRDI QID: Q5058017
No author found.
Publication date: 1 December 2022
Published in: INFORMS Journal on Computing
Full work available at URL: https://doi.org/10.1287/ijoc.2022.1208
Keywords: model selection consistency; survival data analysis; high-dimensional features; \(L_0\)-regularized learning; global and local optimizers; primal dual active sets
Related Items (1)
Cites Work
- Random survival forests
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- Best subset selection via a modern optimization lens
- Innovated scalable efficient estimation in ultra-large Gaussian graphical models
- On Cox processes and credit risky securities
- Statistics for high-dimensional data. Methods, theory and applications.
- Cox's regression model for counting processes: A large sample study
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- One-step sparse estimates in nonconcave penalized likelihood models
- Risk bounds for model selection via penalization
- Least angle regression. (With discussion)
- Extended Bayesian information criterion in the Cox model with a high-dimensional feature space
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Statistical Optimization in High Dimensions
- Penalized high-dimensional empirical likelihood
- Variable Inclusion and Shrinkage Algorithms
- Covariate Selection for the Semiparametric Additive Risk Model
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Scaled sparse linear regression
- Understanding the Impact of Individual Users’ Rating Characteristics on the Predictive Accuracy of Recommender Systems
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Semiparametric analysis of the additive risk model
- A partly parametric additive risk model
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Program Evaluation and Causal Inference With High-Dimensional Data
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- High-Dimensional Sparse Additive Hazards Regression
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Nonsparse Learning with Latent Variables
- Variable screening for survival data in the presence of heterogeneous censoring
- A polynomial algorithm for best-subset selection problem
- High-dimensional macroeconomic forecasting and variable selection via penalized regression
- Faster Kriging: Facing High-Dimensional Simulators
- Convex Optimization for Group Feature Selection in Networked Data
- High-Dimensional Variable Selection for Survival Data
- Regularization and Variable Selection Via the Elastic Net
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- High Dimensional Thresholded Regression and Shrinkage Effect