Projective Integral Updates for High-Dimensional Variational Inference
From MaRDI portal
Publication:6131419
Abstract: This work proposes a quasirandom sequence of quadratures for high-dimensional mean-field variational inference and a related sparsifying methodology. Each iterate of the sequence contains two evaluation points that combine to correctly integrate all univariate quadratic functions, as well as univariate cubics if the mean-field factors are symmetric. More importantly, averaging results over short subsequences achieves periodic exactness on a much larger space of multivariate polynomials of quadratic total degree. This framework is devised by first considering stochastic blocked mean-field quadratures, which may be useful in other contexts. By replacing pseudorandom sequences with quasirandom sequences, over half of all multivariate quadratic basis functions integrate exactly with only 4 function evaluations, and the exactness dimension increases for longer subsequences. Analysis shows how these efficient integrals characterize the dominant log-posterior contributions to mean-field variational approximations, including diagonal Hessian approximations, to support a robust sparsifying methodology in deep learning algorithms. A numerical demonstration of this approach on a simple Convolutional Neural Network for MNIST retains high test accuracy, 96.9%, while training over 98.9% of parameters to zero in only 10 epochs, showing potential to reduce both storage and energy requirements for deep learning models.
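The univariate exactness property described above can be illustrated with a minimal sketch (not the paper's full construction): for a symmetric factor such as a Gaussian N(mu, sigma^2), an equally weighted two-point rule at mu ± sigma matches the first three moments, so it integrates all polynomials up to cubic degree exactly. The function names below are illustrative, not from the paper.

```python
import numpy as np

def two_point_quadrature(f, mu, sigma):
    """Equally weighted two-point rule at mu +/- sigma.

    For a symmetric factor such as N(mu, sigma^2), this rule matches
    E[1], E[x], E[x^2], and E[x^3], so it is exact for all univariate
    cubics (moment matching; illustrative sketch only).
    """
    return 0.5 * (f(mu + sigma) + f(mu - sigma))

mu, sigma = 1.5, 0.7

# Closed-form Gaussian moments for comparison.
exact = {
    "x":  mu,
    "x2": mu**2 + sigma**2,
    "x3": mu**3 + 3 * mu * sigma**2,
}

est = {
    "x":  two_point_quadrature(lambda x: x, mu, sigma),
    "x2": two_point_quadrature(lambda x: x**2, mu, sigma),
    "x3": two_point_quadrature(lambda x: x**3, mu, sigma),
}

for key in exact:
    assert abs(est[key] - exact[key]) < 1e-12
```

The same check applied to a quartic (e.g. x^4) fails, since a two-point rule cannot match four moments; the paper's subsequence averaging addresses exactness on larger multivariate polynomial spaces.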
Cites work
- scientific article; zbMATH DE number 4074520
- scientific article; zbMATH DE number 1215244
- scientific article; zbMATH DE number 1273988
- scientific article; zbMATH DE number 3321507
- A Stochastic Approximation Method
- A Systematization of the Unscented Kalman Filter Theory
- Adaptive subgradient methods for online learning and stochastic optimization
- Algorithms for Numerical Analysis in High Dimensions
- An estimator for the diagonal of a matrix
- Bayesian inference of stochastic reaction networks using multifidelity sequential tempered Markov chain Monte Carlo
- Construction of fully symmetric numerical integration formulas
- Cubature, approximation, and isotropy in the hypercube
- Exactness of quadrature formulas
- Generalized parallel tempering on Bayesian inverse problems
- High-dimensional integration: The quasi-Monte Carlo way
- Numerical integration using sparse grids
- On Information and Sufficiency
- Pattern recognition and machine learning
- Simple cubature formulas with high polynomial exactness
- Smolyak cubature of given polynomial degree with few nodes for increasing dimension