A geometric integration approach to nonsmooth, nonconvex optimisation
From MaRDI portal
Publication: 2088134
Keywords: Clarke subdifferential; geometric numerical integration; discrete gradient methods; bilevel optimisation; derivative-free optimisation; nonsmooth optimisation; nonconvex optimisation
MSC classification
- Numerical mathematical programming methods (65K05)
- Nonconvex programming, global optimization (90C26)
- Stochastic programming (90C15)
- Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
- Derivative-free methods and methods using generalized derivatives (90C56)
Abstract: The optimisation of nonsmooth, nonconvex functions without access to gradients is a particularly challenging problem that is frequently encountered, for example in model parameter optimisation problems. Bilevel optimisation of parameters is a standard setting in areas such as variational regularisation problems and supervised machine learning. We present efficient and robust derivative-free methods called randomised Itoh--Abe methods. These are generalisations of the Itoh--Abe discrete gradient method, a well-known scheme from geometric integration, which has previously only been considered in the smooth setting. We demonstrate that the method and its favourable energy dissipation properties are well-defined in the nonsmooth setting. Furthermore, we prove that whenever the objective function is locally Lipschitz continuous, the iterates almost surely converge to a connected set of Clarke stationary points. We present an implementation of the methods, and apply it to various test problems. The numerical results indicate that the randomised Itoh--Abe methods are superior to state-of-the-art derivative-free optimisation methods in solving nonsmooth problems while remaining competitive in terms of efficiency.
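The core idea in the abstract is that the Itoh--Abe discrete gradient replaces each partial derivative with a coordinate-wise difference quotient, so every update requires only function evaluations. Below is a minimal illustrative sketch, not the paper's implementation: the function name `itoh_abe_step`, the step parameters, and the bisection-based scalar solve are all assumptions, and it cycles deterministically through coordinates rather than sampling random directions as the randomised variants in the paper do.

```python
def itoh_abe_step(V, x, tau, s_init=1e-4, s_max=1e3, tol=1e-10):
    """One cyclic sweep of an Itoh-Abe-type discrete gradient scheme.

    For each coordinate i, the update y_i solves the scalar equation
        (y_i - x_i) / tau = -(V(.., y_i, ..) - V(.., x_i, ..)) / (y_i - x_i),
    equivalently psi(s) = s**2 + tau * (V(y + d*s*e_i) - V(y)) = 0 for a
    step length s > 0 and sign d, using only function evaluations.
    """
    y = list(x)
    for i in range(len(y)):
        base = V(y)

        def psi(s, d):
            z = list(y)
            z[i] += d * s
            return s * s + tau * (V(z) - base)

        for d in (1.0, -1.0):
            if psi(s_init, d) >= 0:
                continue  # no decrease in this direction at the probe scale
            s_lo, s_hi = s_init, s_init
            while psi(s_hi, d) < 0 and s_hi < s_max:
                s_hi *= 2.0  # grow until the nontrivial root is bracketed
            if psi(s_hi, d) < 0:
                y[i] += d * s_hi  # never bracketed: the big step still dissipates V
                break
            while s_hi - s_lo > tol:  # bisect for the nontrivial root of psi
                s_mid = 0.5 * (s_lo + s_hi)
                if psi(s_mid, d) < 0:
                    s_lo = s_mid
                else:
                    s_hi = s_mid
            y[i] += d * 0.5 * (s_lo + s_hi)
            break
    return y

# Nonsmooth demo objective (illustrative choice): V(x) = |x_0| + x_1**2.
V = lambda x: abs(x[0]) + x[1] ** 2
x = [1.0, 2.0]
for _ in range(20):
    x = itoh_abe_step(V, x, tau=0.5)
```

Each accepted coordinate update satisfies (y_i - x_i)^2 / tau = -(change in V), so the objective value decreases monotonically along the iterates; this is the energy-dissipation property of discrete gradient methods that the abstract says carries over to the nonsmooth setting.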
Recommendations
- Limited memory discrete gradient bundle method for nonsmooth derivative-free optimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- Manifold sampling for optimizing nonsmooth nonconvex compositions
- A Stochastic Subgradient Method for Nonsmooth Nonconvex Multilevel Composition Optimization
- Aggregate subgradient smoothing methods for large scale nonsmooth nonconvex optimisation and applications
Cites work
- scientific article; zbMATH DE number 4164577
- scientific article; zbMATH DE number 3905307
- scientific article; zbMATH DE number 46303
- scientific article; zbMATH DE number 3539473
- scientific article; zbMATH DE number 6973983
- scientific article; zbMATH DE number 1376935
- scientific article; zbMATH DE number 1421084
- A Linesearch-Based Derivative-Free Approach for Nonsmooth Constrained Optimization
- A Simplex Method for Function Minimization
- A TGV-based framework for variational image decompression, zooming, and reconstruction. I: Analytics
- A bilevel optimization approach for parameter learning in variational models
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A hybrid extended pattern search/genetic algorithm for multi-stage wind farm optimization
- A nonderivative version of the gradient sampling algorithm for nonsmooth nonconvex optimization
- A proximal bundle method based on approximate subgradients
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- A survey of subdifferential calculus with applications
- Algorithm 909: NOMAD: nonlinear optimization with the MADS algorithm
- An adaptive gradient sampling algorithm for non-smooth optimization
- Bilevel optimization for calibrating point spread functions in blind deconvolution
- Bilevel optimization with nonsmooth lower level problems
- Bilevel parameter learning for higher-order total variation regularisation models
- Bregman Itoh-Abe methods for sparse optimisation
- Derivative-free and blackbox optimization
- Derivative-free optimization methods
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- Discrete gradient methods for solving ODEs numerically while preserving a first integral
- Discrete gradient methods for solving variational image regularisation models
- Discrete gradients for computational Bayesian inference
- Dissipative numerical schemes on Riemannian manifolds with applications to gradient flows
- Evaluating Derivatives
- Generalized convexity of functions and generalized monotonicity of set-valued maps
- Geometric Numerical Integration
- Geometric integration using discrete gradients
- Hamiltonian-conserving discrete canonical equations based on variational difference quotients
- Improving the flexibility and robustness of model-based derivative-free optimization solvers
- Inexact derivative-free optimization for bilevel learning
- Level set and PDE based reconstruction methods in imaging. Lecture notes given at the CIME summer school, Cetraro, Italy, September 2008
- Mesh Adaptive Direct Search Algorithms for Constrained Optimization
- Methods of descent for nondifferentiable optimization
- Modern regularization methods for inverse problems
- Nonlinear total variation based noise removal algorithms
- Nonsmooth optimization via quasi-Newton methods
- On Nesterov's nonsmooth Chebyshev-Rosenbrock functions
- On the equivalence between SOR-type methods for linear systems and the discrete gradient methods for gradient systems
- Optimizing an empirical scoring function for transmembrane protein structure determination
- Parallelized hybrid optimization methods for nonsmooth problems using NOMAD and linesearch
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Random gradient-free minimization of convex functions
- Six lectures on the geometric integration of ODEs
- Strongly Regular Generalized Equations
- Subdifferential properties of quasiconvex and pseudoconvex functions: Unified approach
- The NEWUOA software for unconstrained optimization without derivatives
- Time integration and discrete Hamiltonian systems
- Total generalized variation
- Variational image regularization with Euler's elastica using a discrete gradient scheme
Cited in (5)
- Manifold sampling for optimizing nonsmooth nonconvex compositions
- Variational image regularization with Euler's elastica using a discrete gradient scheme
- scientific article; zbMATH DE number 3922503
- Existence results on Lagrange multiplier approach for gradient flows and application to optimization
- Discrete gradients in short-range molecular dynamics simulations