Global solutions to nonconvex problems by evolution of Hamilton-Jacobi PDEs
From MaRDI portal
DOI: 10.1007/s42967-022-00239-5 · zbMATH Open: 1543.65092 · MaRDI QID: Q6575280
Authors: Howard Heaton, Samy Wu Fung, Stanley Osher
Publication date: 19 July 2024
Published in: Communications on Applied Mathematics and Computation
Mathematics Subject Classification: Numerical optimization and variational techniques (65K10); Nonlinear programming (90C30); Numerical methods for partial differential equations, initial value and time-dependent initial-boundary value problems (65M99)
Cites Work
- Optimization by simulated annealing
- Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces
- First-Order Methods in Optimization
- Fronts propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulations
- Convex Analysis
- Lipschitzian optimization without the Lipschitz constant
- Weighted essentially non-oscillatory schemes
- Differential properties of the Moreau envelope
- Two approximations of solutions of Hamilton-Jacobi equations
- Title not available
- Title not available
- Proximité et dualité dans un espace hilbertien
- Algorithm 909
- Benchmarking Derivative-Free Optimization Algorithms
- Convex analysis and monotone operator theory in Hilbert spaces
- High-Order Essentially Nonoscillatory Schemes for Hamilton–Jacobi Equations
- The Elements of Statistical Learning
- Title not available
- Envelopes and nonconvex Hamilton-Jacobi equations
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- Deep relaxation: partial differential equations for optimizing deep neural networks
- Derivative-free optimization methods
- Entropy-SGD: biasing gradient descent into wide valleys
- Subgradient methods for sharp weakly convex functions
- Stochastic Model-Based Minimization of Weakly Convex Functions
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Zeroth-order optimization with orthogonal random directions
- A one-bit, comparison-based gradient estimator
- Zeroth-Order Regularized Optimization (ZORO): Approximately Sparse Gradients and Adaptive Sampling
- On the numerical performance of finite-difference-based methods for derivative-free optimization
- Adaptive Finite-Difference Interval Estimation for Noisy Derivative-Free Optimization
- Escaping Strict Saddle Points of the Moreau Envelope in Nonsmooth Optimization