Performance analysis of greedy algorithms for minimising a maximum mean discrepancy
From MaRDI portal
Publication: Q2104022
DOI: 10.1007/S11222-022-10184-1
zbMATH Open: 1499.62029
arXiv: 2101.07564
OpenAlex: W3124611209
MaRDI QID: Q2104022
Publication date: 9 December 2022
Published in: Statistics and Computing
Abstract: We analyse the performance of several iterative algorithms for the quantisation of a probability measure $\mu$, based on the minimisation of a Maximum Mean Discrepancy (MMD). Our analysis includes kernel herding, greedy MMD minimisation and Sequential Bayesian Quadrature (SBQ). We show that the finite-sample-size approximation error, measured by the MMD, decreases as $1/n$ for SBQ, and also for kernel herding and greedy MMD minimisation when a suitable step-size sequence is used. The upper bound on the approximation error is slightly better for SBQ, but the other methods are significantly faster, with a computational cost that increases only linearly with the number of points selected. This is illustrated by two numerical examples, with the target measure being uniform (a space-filling design application) and a Gaussian mixture. The examples suggest that the bounds derived in the paper are overly pessimistic, in particular for SBQ. The sources of this pessimism are identified but seem difficult to counter.
Full work available at URL: https://arxiv.org/abs/2101.07564
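A minimal sketch of the greedy MMD minimisation idea described in the abstract, in Python with NumPy. The Gaussian kernel, the finite candidate set used to discretise the search, and all names are illustrative assumptions, not the paper's implementation: at each step the candidate point that most reduces the MMD between the uniform measure on the selected points and the empirical target measure is added.

```python
import numpy as np

def gaussian_kernel(X, Y, h=1.0):
    """Pairwise Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h * h))

def greedy_mmd(candidates, target_sample, n_points, h=1.0):
    """Greedy MMD minimisation over a finite candidate set (illustrative sketch).

    At step n+1, adding a point x to the current points x_1, ..., x_n changes
    MMD^2 (up to terms constant in x) by
        (2 * sum_i k(x, x_i) + k(x, x)) / (n+1)^2  -  2 * P k(x) / (n+1),
    where P k(x) = E_{y ~ mu} k(x, y) is the kernel mean (potential) of the
    target measure, here approximated by an empirical average over a sample.
    Repeated selections are allowed, as in quantisation with uniform weights.
    """
    K_ct = gaussian_kernel(candidates, target_sample, h)
    potential = K_ct.mean(axis=1)            # approximates P k(x) for each candidate
    K_cc = gaussian_kernel(candidates, candidates, h)
    diag = np.diag(K_cc)                     # k(x, x) terms
    cum = np.zeros(len(candidates))          # sum_i k(x, x_i) over selected x_i
    selected = []
    for n in range(n_points):
        score = (2.0 * cum + diag) / (n + 1) ** 2 - 2.0 * potential / (n + 1)
        idx = int(np.argmin(score))          # point giving the smallest new MMD^2
        selected.append(idx)
        cum += K_cc[:, idx]
    return candidates[selected]
```

The cost per step is linear in the candidate-set size once the kernel matrices are computed, which mirrors the linear growth in computational cost with the number of selected points noted in the abstract; SBQ, by contrast, re-optimises quadrature weights and is more expensive.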
Keywords: computer experiments; greedy algorithm; space-filling design; maximum mean discrepancy; quantisation; kernel herding; sequential Bayesian quadrature
Cites Work
- Equivalence of distance-based and RKHS-based statistics in hypothesis testing
- Approximation Theorems of Mathematical Statistics
- Probabilistic integration: a role in statistical computation?
- Hilbert space embeddings and metrics on probability measures
- Maximum projection designs for computer experiments
- Energy statistics: a class of statistics based on distances
- The Sequential Generation of $D$-Optimum Experimental Designs
- Title not available
- Foundations of quantization for probability distributions
- Conditional gradient algorithms with open loop step size rules
- Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm
- A generalized discrepancy and quadrature error bound
- Coordinate descent algorithms
- Finding the nearest point in a polytope
- Control Functionals for Monte Carlo Integration
- On energy, discrepancy and group invariant measures on measurable subsets of Euclidean space
- An asymptotically optimal gradient algorithm for quadratic optimization with low computational cost
- Design of computer experiments: space filling and beyond
- On Khachiyan's algorithm for the computation of minimum-volume enclosing ellipsoids
- Sequences converging to D-optimal designs of experiments
- Linear convergence of a modified Frank–Wolfe algorithm for computing minimum-volume enclosing ellipsoids
- Design of experiments in nonlinear models. Asymptotic normality, optimality criteria and small-sample properties
- Title not available
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- Measuring sample quality with diffusions
- Support points
- Title not available
- Minimum-energy measures for singular kernels
- On the positivity and magnitudes of Bayesian quadrature weights
- Bayesian Quadrature, Energy Minimization, and Space-Filling Design
Cited In (1)