Learning from a lot: empirical Bayes for high-dimensional model-based prediction
From MaRDI portal
Publication:4629271
Abstract: Empirical Bayes is a versatile approach to 'learn from a lot' in two ways: first, from a large number of variables, and second, from a potentially large amount of prior information, e.g. stored in public repositories. We review applications of a variety of empirical Bayes methods to several well-known model-based prediction methods, including penalized regression, linear discriminant analysis, and Bayesian models with sparse or dense priors. We discuss 'formal' empirical Bayes methods, which maximize the marginal likelihood, as well as more informal approaches based on other data summaries. We contrast empirical Bayes with cross-validation and full Bayes, and discuss hybrid approaches. To study the relation between the quality of an empirical Bayes estimator and the number of variables, we consider a simple empirical Bayes estimator in a linear model setting. We argue that empirical Bayes is particularly useful when the prior contains multiple parameters which model a priori information on variables, termed 'co-data'. In particular, we present two novel examples that allow for co-data: first, a Bayesian spike-and-slab setting that facilitates inclusion of multiple co-data sources and types; second, a hybrid empirical Bayes-full Bayes ridge regression approach for estimation of the posterior predictive interval.
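The abstract mentions studying a simple empirical Bayes estimator in a linear model setting, where a prior variance is estimated by maximizing the marginal likelihood across many variables. The sketch below (not taken from the paper; model and parameter values are illustrative assumptions) shows the idea in the classic normal-means special case: observations y_i ~ N(theta_i, 1) with prior theta_i ~ N(0, tau2), so marginally y_i ~ N(0, 1 + tau2) and the marginal-likelihood maximizer has the closed form tau2_hat = max(0, mean(y^2) - 1).

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal-means model: y_i ~ N(theta_i, 1), prior theta_i ~ N(0, tau2).
# Marginally y_i ~ N(0, 1 + tau2), so the marginal maximum likelihood
# estimate is tau2_hat = max(0, mean(y^2) - 1).
p = 5000                     # many variables: 'learning from a lot'
tau2_true = 4.0              # illustrative true prior variance
theta = rng.normal(0.0, np.sqrt(tau2_true), size=p)
y = theta + rng.normal(size=p)

# Empirical Bayes step: estimate the prior variance from all p variables.
tau2_hat = max(0.0, np.mean(y**2) - 1.0)

# Plug-in posterior mean: shrink each observation toward the prior mean 0.
shrink = tau2_hat / (1.0 + tau2_hat)
theta_eb = shrink * y

# Compare to the unshrunk (maximum likelihood) estimator theta_hat = y.
mse_eb = np.mean((theta_eb - theta) ** 2)
mse_mle = np.mean((y - theta) ** 2)
print(f"tau2_hat = {tau2_hat:.2f}, MSE(EB) = {mse_eb:.3f}, MSE(MLE) = {mse_mle:.3f}")
```

Because the prior variance is estimated from all p variables jointly, its estimation error shrinks as p grows, which is the sense in which the quality of the empirical Bayes estimator improves with the number of variables.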
Recommendations
- Empirical priors for prediction in sparse high-dimensional linear regression
- Empirical Bayes estimates for large-scale prediction problems
- Empirical Bayes posterior concentration in sparse high-dimensional linear models
- Empirical Bayes regression analysis with many regressors but fewer observations
- Efficient Empirical Bayes Variable Selection and Estimation in Linear Models
Cites work
- scientific article (zbMATH DE number 795286; no title available)
- A review of Bayesian variable selection methods: what, how and which
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Asymptotic behaviour of the empirical Bayes posteriors associated to maximum marginal likelihood estimator
- Bayes and empirical Bayes methods for data analysis
- Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem
- Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables
- Calibration and empirical Bayes variable selection
- Coupling a stochastic approximation version of EM with an MCMC procedure
- Data Analysis Using Stein's Estimator and its Generalizations
- EMVS: the EM approach to Bayesian variable selection
- Empirical Bayes Gibbs sampling
- Empirical Bayes estimates for large-scale prediction problems
- Empirical Bayes prediction intervals in a normal regression model: higher order asymptotics
- Empirical Bayesian Estimators for a Poisson Process Propagated in Time
- Empirical and fully Bayesian approaches for random effects models in microarray data analysis
- Gene network reconstruction using global-local shrinkage priors
- High-dimensional classification via nonparametric empirical Bayes and maximum likelihood inference
- IPF-LASSO: integrative \(L_1\)-penalized regression with penalty factors for prediction based on multi-omics data
- Incorporating biological information into linear models: a Bayesian approach to the selection of pathways and genes
- Laplace approximation in high-dimensional Bayesian regression
- Large-scale inference. Empirical Bayes methods for estimation, testing, and prediction
- Marginal maximum likelihood estimation methods for the tuning parameters of ridge, power ridge, and generalized ridge regression
- Maximizing Generalized Linear Mixed Model Likelihoods With an Automated Monte Carlo EM Algorithm
- Needles and straw in a haystack: posterior concentration for possibly sparse sequences
- Needles and straw in haystacks: Empirical Bayes estimates of possibly sparse sequences
- On high-dimensional misspecified mixed model analysis in genome-wide association study
- Parametric Empirical Bayes Inference: Theory and Applications
- Regularization and Variable Selection Via the Elastic Net
- Ridge Estimators in Logistic Regression
- Ridge regression: some simulations
- Sampling-Based Approaches to Calculating Marginal Densities
- Scalable variational inference for Bayesian variable selection in regression, and its accuracy in genetic association studies
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Spike and slab variable selection: frequentist and Bayesian strategies
- The Bayesian Lasso
- The Bayesian elastic net
- The Group Lasso for Logistic Regression
- The role of empirical Bayes methodology as a leading principle in modern medical statistics
- Weighted lasso with data integration
Cited in (8)
- Flexible co-data learning for high-dimensional prediction
- Scalable multiple network inference with the joint graphical horseshoe
- A review of uncertainty quantification for density estimation
- Fast marginal likelihood estimation of penalties for group-adaptive elastic net
- Empirical Bayes estimates for large-scale prediction problems
- Drug sensitivity prediction with normal inverse Gaussian shrinkage informed by external data
- A global-local approach for detecting hotspots in multiple-response regression
- Empirical priors for prediction in sparse high-dimensional linear regression