Random projections for Bayesian regression
DOI: 10.1007/s11222-015-9608-z
zbMath: 1505.62157
arXiv: 1504.06122
OpenAlex: W2152435742
Wikidata: Q59602664
Scholia: Q59602664
MaRDI QID: Q144017
Katja Ickstadt, Alexander Munteanu, Leo N. Geppert, Christian Sohler, Jens Quedenfeld
Publication date: 19 November 2015
Published in: Statistics and Computing
Full work available at URL: https://arxiv.org/abs/1504.06122
Computational methods for problems pertaining to statistics (62-08)
Linear regression; mixed models (62J05)
Bayesian inference (62F15)
Randomized algorithms (68W20)
Related Items (5)
Uses Software
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo
- Random projections for Bayesian regression
- Faster least squares approximation
- A simple proof of the restricted isometry property for random matrices
- Fast dimension reduction using Rademacher series on dual BCH codes
- A class of Wasserstein metrics for probability distributions
- The space complexity of approximating the frequency moments
- Likelihood-based data squashing: A modeling approach to instance construction
- Bayesian computing with INLA: new features
- Principal component analysis.
- Numerical methods for solving linear least squares problems
- The Fast Cauchy Transform and Faster Robust Linear Regression
- Improved Matrix Algorithms via the Subsampled Randomized Hadamard Transform
- Communication-optimal Parallel and Sequential QR and LU Factorizations
- Approximate Bayesian Inference for Latent Gaussian models by using Integrated Nested Laplace Approximations
- Dimensionality Reduction for k-Means Clustering and Low Rank Approximation
- Low-Rank Approximation and Regression in Input Sparsity Time
- Data Streams: Algorithms and Applications
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Sampling algorithms for $\ell_2$ regression and applications
- Sampling Algorithms and Coresets for $\ell_p$ Regression
- Spectral Algorithms
- A Reliable Randomized Algorithm for the Closest-Pair Problem
- Twice-Ramanujan Sparsifiers
- Lower Bounds for Oblivious Subspace Embeddings
- Numerical linear algebra in the streaming model
- Speeding Up MCMC by Efficient Data Subsampling
- The Johnson-Lindenstrauss Transform: An Empirical Study
- Near-Optimal Coresets for Least-Squares Regression
- Sketching for M-Estimators: A Unified Approach to Robust Regression
- Bayesian Compressed Regression
- Efficient Gaussian process regression for large datasets
- Sparsity lower bounds for dimensionality reducing maps
- Turning Big data into tiny data: Constant-size coresets for k-means, PCA and projective clustering
- Optimal Transport
- Compressed sensing
- A one-pass sequential Monte Carlo method for Bayesian analysis of massive datasets
- Prior distributions for variance parameters in hierarchical models (Comment on article by Browne and Draper)