Optimal subsampling for parametric accelerated failure time models with massive survival data
From MaRDI portal
Publication: Q6629381
Cites work
- A split-and-conquer approach for analysis of
- A split-and-merge Bayesian variable selection approach for ultrahigh dimensional regression
- A statistical perspective on algorithmic leveraging
- An online updating approach for testing the proportional hazards assumption with streams of survival data
- Bayesian accelerated failure time model for correlated interval-censored data with a normal mixture as error distribution
- Multivariate survival analysis in big data: A divide‐and‐combine approach
- Online Updating of Survival Analysis
- Online updating method with new variables for big data streams
- Optimal subsampling algorithms for big data regressions
- Optimal subsampling for large sample logistic regression
- Optimal subsampling for quantile regression in big data
- Optimum experimental designs, with SAS
- Sampling algorithms for \(l_2\) regression and applications
- Sampling-based estimation for massive survival data with additive hazards model
- Statistical methods and computing for big data
- Weighted Average Importance Sampling and Defensive Mixture Distributions
Cited in (3)