Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
From MaRDI portal
Publication:6136656
Abstract: We leverage path differentiability and a recent result on nonsmooth implicit differentiation calculus to give sufficient conditions ensuring that the solution to a monotone inclusion problem is path differentiable, with formulas for computing its generalized gradient. A direct consequence of our result is that these solutions are differentiable almost everywhere. Our approach is fully compatible with automatic differentiation and comes with easy-to-check assumptions, roughly speaking: semialgebraicity and strong monotonicity. We illustrate the scope of our results by considering three fundamental composite problem settings: strongly convex problems, dual solutions to convex minimization problems, and primal-dual solutions to min-max problems.
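The abstract's theme of differentiating solution maps can be illustrated, in the classical smooth and strongly convex special case, by implicit differentiation of the first-order optimality condition. The sketch below is not the paper's nonsmooth calculus; it is a minimal numerical example, with a hypothetical quadratic objective f(x, θ) = ½ xᵀAx − θᵀx chosen for illustration, showing how the Jacobian of the solution map θ ↦ x*(θ) follows from the implicit function theorem and matches a finite-difference check.

```python
import numpy as np

# Smooth, strongly convex illustration (an assumption of this sketch,
# not the paper's nonsmooth setting): for min_x f(x, theta) with
# optimality condition grad_x f(x*, theta) = 0, the implicit function
# theorem gives
#     d x*/d theta = -(H_xx)^{-1} H_xtheta,
# where H_xx is the Hessian in x and H_xtheta the mixed derivative.
# Here f(x, theta) = 0.5 x^T A x - theta^T x with A symmetric positive
# definite, so grad_x f = A x - theta and x*(theta) = A^{-1} theta.

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3.0 * np.eye(3)        # symmetric positive definite => strong monotonicity
theta = rng.standard_normal(3)

x_star = np.linalg.solve(A, theta)   # closed-form solution of the problem

H_xx = A                             # Hessian of f in x
H_xtheta = -np.eye(3)                # mixed second derivative of grad_x f
J_implicit = -np.linalg.solve(H_xx, H_xtheta)   # implicit-function Jacobian, equals A^{-1}

# Finite-difference check of d x*/d theta, column by column
eps = 1e-6
J_fd = np.column_stack([
    (np.linalg.solve(A, theta + eps * e) - x_star) / eps
    for e in np.eye(3)
])

assert np.allclose(J_implicit, J_fd, atol=1e-5)
```

In the nonsmooth setting treated by the paper, the Hessian-based formula above is replaced by generalized gradients of path-differentiable maps, but the structure of the computation, differentiating through the optimality condition rather than the solver iterations, is the same idea that makes the approach compatible with automatic differentiation.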
Recommendations
- Sensitivity analysis of maximally monotone inclusions via the proto-differentiability of the resolvent operator
- Uniqueness and differentiability of solutions of parametric nonlinear complementarity problems
- Path differentiability of ODE flows
- scientific article; zbMATH DE number 1099074
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
Cites work
- scientific article; zbMATH DE number 3882752
- scientific article; zbMATH DE number 46303
- scientific article; zbMATH DE number 2143180
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A forward-backward splitting method for monotone inclusions without cocoercivity
- A primal-dual algorithm with line search for general convex-concave saddle point problems
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- A stochastic Bregman primal-dual splitting algorithm for composite optimization
- Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
- An inertial forward-backward-forward primal-dual splitting algorithm for solving monotone inclusion problems
- Automatic differentiation of iterative processes
- Bilevel optimization with nonsmooth lower level problems
- Clarke Subgradients of Stratifiable Functions
- Clarke generalized Jacobian of the projection onto the cone of positive semidefinite matrices
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Convergence of a piggyback-style method for the differentiation of solutions of standard saddle-point problems
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Convex analysis and monotone operator theory in Hilbert spaces
- Error bounds in mathematical programming
- Fifty years of maximal monotonicity
- Finding best approximation pairs relative to two closed convex sets in Hilbert spaces
- First-order methods in optimization
- Fixed Point Strategies in Data Science
- From error bounds to the complexity of first-order descent methods for convex functions
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- Generalized equations and their solutions, Part I: Basic theory
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
- Lagrangian Duality and Related Multiplier Methods for Variational Inequality Problems
- Learning consistent discretizations of the total variation
- Learning maximally monotone operators for image recovery
- Local linear convergence analysis of primal-dual splitting methods
- Necessary and sufficient optimality conditions for mathematical programs with equilibrium constraints
- On the ergodic convergence rates of a first-order primal-dual algorithm
- On the maximal monotonicity of subdifferential mappings
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Proto-differentiability of set-valued mappings and its applications in optimization
- Sensitivity Analysis of Solutions to Generalized Equations
- Sensitivity analysis for mirror-stratifiable convex functions
- Sensitivity analysis for nonsmooth generalized equations
- Sensitivity analysis of generalized equations
- Sensitivity analysis of maximally monotone inclusions via the proto-differentiability of the resolvent operator
- Signal Recovery by Proximal Forward-Backward Splitting
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Sparsifying Transform Learning With Efficient Optimal Updates and Convergence Guarantees
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Strongly Regular Generalized Equations
- Tangent Cones, Generalized Gradients and Mathematical Programming in Banach Spaces
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- The degrees of freedom of partly smooth regularizers
- Variational Analysis