Ensemble-Based Gradient Inference for Particle Methods in Optimization and Sampling
DOI: 10.1137/22m1533281
zbMath: 1518.65142
arXiv: 2209.15420
MaRDI QID: Q6177924
Philipp Wacker, Claudia Schillings, Claudia Totzeck
Publication date: 31 August 2023
Published in: SIAM/ASA Journal on Uncertainty Quantification
Full work available at URL: https://arxiv.org/abs/2209.15420
Mathematics Subject Classification:
- Probabilistic methods, particle methods, etc. for boundary value problems involving PDEs (65N75)
- Bayesian inference (62F15)
- Nonconvex programming, global optimization (90C26)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Stochastic ordinary differential equations (aspects of stochastic analysis) (60H10)
- Dynamical systems in optimization and economics (37N40)
- Vlasov equations (35Q83)
Cites Work
- Gradient and diagonal Hessian approximations using quadratic interpolation models and aligned regular bases
- Exponential convergence of Langevin distributions and their discrete approximations
- A note on the mean-field limit for the particle swarm optimization
- Inexact derivative-free optimization for bilevel learning
- Consensus-based global optimization with personal best
- Iterative ensemble Kalman methods: a unified perspective with some new variants
- Optimal scaling of random-walk Metropolis algorithms on general target distributions
- A derivative-free Gauss-Newton method
- The calculus of simplex gradients
- Efficient calculation of regular simplex gradients
- Geometry of interpolation sets in derivative free optimization
- On the Lagrange functions of quadratic models that are defined by interpolation
- A consensus-based model for global optimization and its mean-field limit
- Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting
- Consensus-based optimization on hypersurfaces: Well-posedness and mean-field limit
- Using simplex gradients of nonsmooth functions in direct search methods
- Introduction to Derivative-Free Optimization
- Robust Locally Weighted Regression and Smoothing Scatterplots
- The Variational Formulation of the Fokker–Planck Equation
- An analytical framework for consensus-based global optimization method
- Affine Invariant Interacting Langevin Dynamics for Bayesian Inference
- A consensus-based global optimization method for high dimensional machine learning problems
- Trends in Consensus-Based Optimization
- Stochastic consensus dynamics for nonconvex optimization on the Stiefel manifold: Mean-field limit and convergence
- Interacting Langevin Diffusions: Gradient Structure and Ensemble Kalman Sampler
- From particle swarm optimization to consensus based optimization: Stochastic modeling and mean-field limit
- Well posedness and convergence analysis of the ensemble Kalman inversion
- Analysis of the Ensemble Kalman Filter for Inverse Problems
- Consensus-based optimization via jump-diffusion stochastic differential equations
- An adaptive consensus based method for multi-objective optimization with uniform Pareto front approximation