Random projections for Bayesian regression
Abstract: This article deals with random projections applied as a data reduction technique for Bayesian regression analysis. We show sufficient conditions under which the entire \(d\)-dimensional distribution is approximately preserved under random projections by reducing the number of data points from \(n\) to \(k\in O(\operatorname{poly}(d/\varepsilon))\) in the case \(n\gg d\). Under mild assumptions, we prove that evaluating a Gaussian likelihood function based on the projected data instead of the original data yields a \((1+O(\varepsilon))\)-approximation in terms of the Wasserstein distance. Our main result shows that the posterior distribution of Bayesian linear regression is approximated up to a small error depending on only an \(\varepsilon\)-fraction of its defining parameters. This holds when using arbitrary Gaussian priors or the degenerate case of uniform distributions over \(\mathbb{R}^d\) for \(\beta\). Our empirical evaluations involve different simulated settings of Bayesian linear regression. Our experiments underline that the proposed method is able to recover the regression model up to small error while considerably reducing the total running time.
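The data reduction described in the abstract can be illustrated with a minimal sketch. The example below uses a plain Gaussian sketching matrix and a conjugate Gaussian prior, so the posterior is available in closed form on both the original and the projected data; the dimensions, noise level, and choice of sketch are illustrative assumptions, not the authors' exact embedding or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regression data in the regime n >> d considered in the paper.
n, d, k = 10_000, 5, 400
X = rng.standard_normal((n, d))
beta_true = rng.standard_normal(d)
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def gaussian_posterior(X, y, sigma2=0.25, tau2=10.0):
    """Posterior mean and covariance of beta under a N(0, tau2*I) prior
    and a Gaussian likelihood with noise variance sigma2 (conjugate case)."""
    precision = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
    cov = np.linalg.inv(precision)
    mean = cov @ (X.T @ y) / sigma2
    return mean, cov

# One simple choice of random projection: a k x n Gaussian sketch.
# (Structured embeddings with faster apply time also fit this framework.)
S = rng.standard_normal((k, n)) / np.sqrt(k)
X_sk, y_sk = S @ X, S @ y

m_full, C_full = gaussian_posterior(X, y)
m_sk, C_sk = gaussian_posterior(X_sk, y_sk)

# The sketched posterior mean should be close to the full-data one,
# while the likelihood is now evaluated on k instead of n rows.
rel_err = np.linalg.norm(m_full - m_sk) / np.linalg.norm(m_full)
print(rel_err)
```

After projection, any downstream sampler or optimizer only touches the \(k \times d\) matrix, which is where the running-time savings reported in the experiments come from.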
Recommendations
- Bayesian inference via projections
- Random Projections for Large-Scale Regression
- The regression estimation with random projection
- scientific article; zbMATH DE number 6276198
- Bayesian Inference with Projected Densities
- Bayesian estimation for nonparametric regression
- scientific article; zbMATH DE number 3907571
- Nonparametric Bayesian regression
- Bayesian shrinkage prediction for the regression problem
- scientific article; zbMATH DE number 2143289
Cites work
- scientific article; zbMATH DE number 47926
- scientific article; zbMATH DE number 852525
- scientific article; zbMATH DE number 6159604
- A Reliable Randomized Algorithm for the Closest-Pair Problem
- A class of Wasserstein metrics for probability distributions
- A one-pass sequential Monte Carlo method for Bayesian analysis of massive datasets
- A simple proof of the restricted isometry property for random matrices
- A statistical perspective on algorithmic leveraging
- Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations (with discussion)
- Bayesian compressed regression
- Bayesian computing with INLA: new features
- Bayesian data analysis.
- Communication-optimal parallel and sequential QR and LU factorizations
- Compressed sensing
- Data Streams: Algorithms and Applications
- Dimensionality reduction for \(k\)-means clustering and low rank approximation
- Efficient Gaussian process regression for large datasets
- Fast dimension reduction using Rademacher series on dual BCH codes
- Faster least squares approximation
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Improved matrix algorithms via the subsampled randomized Hadamard transform
- Likelihood-based data squashing: A modeling approach to instance construction
- Lower bounds for oblivious subspace embeddings
- Near-Optimal Coresets for Least-Squares Regression
- Numerical linear algebra in the streaming model
- Numerical methods for solving linear least squares problems
- Optimal Transport
- Pattern recognition and machine learning.
- Principal component analysis.
- Prior distributions for variance parameters in hierarchical models (Comment on article by Browne and Draper)
- Random projections for Bayesian regression
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Sampling Algorithms and Coresets for $\ell_p$ Regression
- Sampling algorithms for \(l_2\) regression and applications
- Sketching for M-Estimators: A Unified Approach to Robust Regression
- Sparsity lower bounds for dimensionality reducing maps
- Spectral algorithms
- Speeding Up MCMC by Efficient Data Subsampling
- The Johnson-Lindenstrauss Transform: An Empirical Study
- The no-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
- The space complexity of approximating the frequency moments
- Turning big data into tiny data: constant-size coresets for \(k\)-means, PCA and projective clustering
- Twice-Ramanujan sparsifiers
Cited in (10)
- scientific article; zbMATH DE number 6276198
- Bayesian compressed regression
- On randomized sketching algorithms and the Tracy-Widom law
- Optimal projection of observations in a Bayesian setting
- Stability of random-projection based classifiers. The Bayes error perspective
- Random projections for Bayesian regression
- Beta Random Projection
- Improving Random Projections Using Marginal Information
- RaProR
- Automated scalable Bayesian inference via Hilbert coresets
This page was built for publication: Random projections for Bayesian regression (MaRDI item Q144017)