Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems
DOI: 10.1137/22M1541630 · zbMATH Open: 1530.49026 · arXiv: 2212.07844 · OpenAlex: W4311599212 · MaRDI QID: Q6136656 · FDO: Q6136656
Authors: Jérôme Bolte, Edouard Pauwels, Antonio Silveti-Falls
Publication date: 17 January 2024
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2212.07844
Recommendations
- Sensitivity analysis of maximally monotone inclusions via the proto-differentiability of the resolvent operator
- Uniqueness and differentiability of solutions of parametric nonlinear complementarity problems
- Path differentiability of ODE flows
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
Keywords: generalized equation; maximal monotone operator; monotone inclusion; Clarke subdifferential; generalized gradient; implicit differentiation; conservative field; differentiating solutions
Mathematics Subject Classification: Nonsmooth analysis (49J52); Set-valued and variational analysis (49J53); Optimality conditions for minimax problems (49K35); Sensitivity, stability, well-posedness (49K40)
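The paper concerns differentiating, in the sense of Clarke subdifferentials and conservative fields, solutions x(θ) of parametric monotone inclusions. As a minimal illustrative sketch (not code from the paper; the LASSO instance, step size, and all names below are assumptions chosen for illustration), the snippet propagates a derivative dx/dθ alongside forward-backward (ISTA) iterations in a piggyback style, using an element of the Clarke Jacobian of soft-thresholding, and checks it against finite differences:

```python
import numpy as np

# Hypothetical problem data: min_x 0.5*||Ax - b||^2 + theta*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
theta = 0.5                                  # assumed regularization parameter
gamma = 0.9 / np.linalg.norm(A, 2) ** 2      # step size < 1/L, with L = ||A||_2^2

def solve_and_differentiate(th, iters=2000):
    """ISTA for the LASSO, piggybacking the sensitivity dx/dth."""
    x = np.zeros(A.shape[1])                 # primal iterate
    dx = np.zeros(A.shape[1])                # derivative of the iterate w.r.t. th
    for _ in range(iters):
        u = x - gamma * A.T @ (A @ x - b)    # forward (gradient) step
        du = dx - gamma * A.T @ (A @ dx)     # its derivative (A, b fixed)
        active = np.abs(u) > gamma * th      # Clarke Jacobian element of the prox
        x = np.sign(u) * np.maximum(np.abs(u) - gamma * th, 0.0)  # backward step
        dx = np.where(active, du - gamma * np.sign(u), 0.0)       # chain rule
    return x, dx

x, dx = solve_and_differentiate(theta)

# Finite-difference sanity check of the propagated sensitivity.
eps = 1e-6
fd = (solve_and_differentiate(theta + eps)[0]
      - solve_and_differentiate(theta - eps)[0]) / (2 * eps)
print("max |dx - finite diff| =", np.max(np.abs(dx - fd)))
```

Away from the nonsmooth set (generically the case for random data), the propagated derivative matches the finite-difference sensitivity; on the nonsmooth set one only recovers an element of a conservative Jacobian rather than a classical derivative.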
Cites Work
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Variational Analysis
- Convex analysis and monotone operator theory in Hilbert spaces
- First-order methods in optimization
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- Signal Recovery by Proximal Forward-Backward Splitting
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Clarke Subgradients of Stratifiable Functions
- Strongly Regular Generalized Equations
- Necessary and sufficient optimality conditions for mathematical programs with equilibrium constraints
- Error bounds in mathematical programming
- On the maximal monotonicity of subdifferential mappings
- An inertial forward-backward-forward primal-dual splitting algorithm for solving monotone inclusion problems
- Inertial Douglas-Rachford splitting for monotone inclusion problems
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Generalized equations and their solutions, Part I: Basic theory
- Tangent Cones, Generalized Gradients and Mathematical Programming in Banach Spaces
- Proto-differentiability of set-valued mappings and its applications in optimization
- Sensitivity Analysis of Solutions to Generalized Equations
- A forward-backward splitting method for monotone inclusions without cocoercivity
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Finding best approximation pairs relative to two closed convex sets in Hilbert spaces
- Clarke generalized Jacobian of the projection onto the cone of positive semidefinite matrices
- Sensitivity analysis of generalized equations
- Fifty years of maximal monotonicity
- On the ergodic convergence rates of a first-order primal-dual algorithm
- Lagrangian Duality and Related Multiplier Methods for Variational Inequality Problems
- A stochastic Bregman primal-dual splitting algorithm for composite optimization
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- From error bounds to the complexity of first-order descent methods for convex functions
- Sensitivity analysis for nonsmooth generalized equations
- Automatic differentiation of iterative processes
- Sensitivity analysis for mirror-stratifiable convex functions
- The degrees of freedom of partly smooth regularizers
- Bilevel optimization with nonsmooth lower level problems
- Local linear convergence analysis of primal-dual splitting methods
- Convergence of the forward-backward algorithm: beyond the worst-case with the help of geometry
- Sparsifying Transform Learning With Efficient Optimal Updates and Convergence Guarantees
- Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning
- Perturbation techniques for convergence analysis of proximal gradient method and other first-order algorithms via variational analysis
- Fixed Point Strategies in Data Science
- Learning consistent discretizations of the total variation
- Learning maximally monotone operators for image recovery
- A primal-dual algorithm with line search for general convex-concave saddle point problems
- Accelerated Bregman Primal-Dual Methods Applied to Optimal Transport and Wasserstein Barycenter Problems
- Sensitivity analysis of maximally monotone inclusions via the proto-differentiability of the resolvent operator
- Convergence of a piggyback-style method for the differentiation of solutions of standard saddle-point problems
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
Cited in: 1 document