Deep neural network structures solving variational inequalities
From MaRDI portal
Publication: 2194605
DOI: 10.1007/s11228-019-00526-z
zbMath: 1448.49014
arXiv: 1808.07526
OpenAlex: W3006592723
MaRDI QID: Q2194605
Jean-Christophe Pesquet, Patrick L. Combettes
Publication date: 4 September 2020
Published in: Set-Valued and Variational Analysis
Full work available at URL: https://arxiv.org/abs/1808.07526
Keywords: variational inequality; monotone operator; proximity operator; nonexpansive operator; averaged operator; deep neural network
Related Items (23)
- Variational models for signal processing with graph neural networks
- Regularization theory of the analytic deep prior approach
- Deep solution operators for variational inequalities via proximal neural networks
- Wasserstein-Based Projections with Applications to Inverse Problems
- Designing rotationally invariant neural networks from PDEs and variational methods
- Multivariate Monotone Inclusions in Saddle Form
- Analysis of two versions of relaxed inertial algorithms with Bregman divergences for solving variational inequalities
- Connections between numerical algorithms for PDEs and neural networks
- Resolvent and proximal compositions
- Convergence Results for Primal-Dual Algorithms in the Presence of Adjoint Mismatch
- The use of physics-informed neural network approach to image restoration via nonlinear PDE tools
- Convolutional proximal neural networks and plug-and-play algorithms
- Deep neural networks motivated by partial differential equations
- Reconstruction of functions from prescribed proximal points
- Data-Driven Nonsmooth Optimization
- On \(\alpha\)-firmly nonexpansive operators in \(r\)-uniformly convex spaces
- Synthesis of recurrent neural dynamics for monotone inclusion with application to Bayesian inference
- Frame soft shrinkage operators are proximity operators
- Convergence of proximal gradient algorithm in the presence of adjoint mismatch
- Attouch--Théra Duality, Generalized Cycles, and Gap Vectors
- Learning Maximally Monotone Operators for Image Recovery
- A Variational Inequality Model for the Construction of Signals from Inconsistent Nonlinear Equations
- Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
Cites Work
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Linear and strong convergence of algorithms involving averaged nonexpansive operators
- Iterative methods for fixed point problems in Hilbert spaces
- Compositions and convex combinations of averaged nonexpansive operators
- A dynamical system associated with the fixed points set of a nonexpansive operator
- There is no variational characterization of the cycles in the method of periodic projections
- Convergence properties of dynamic string-averaging projection methods in the presence of perturbations
- A new projection method for finding the closest point in the intersection of convex sets
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Sharp convergence rates for averaged nonexpansive maps
- Monotone operator theory in convex optimization
- The forward-backward algorithm and the normal problem
- Backward-forward algorithms for structured monotone inclusions in Hilbert spaces
- New Douglas--Rachford Algorithmic Structures and Their Convergence Analyses
- Convergence Rate Analysis for Averaged Fixed Point Iterations in Common Fixed Point Problems
- Krasnoselski-Mann Iterations in Normed Spaces
- Proximal Thresholding Algorithm for Minimization over Orthonormal Bases
- On the Convergence of the Products of Firmly Nonexpansive Mappings
- Monotone Operators and the Proximal Point Algorithm
- Universal approximation bounds for superpositions of a sigmoidal function
- Solving monotone inclusions via compositions of nonexpansive averaged operators
- On Projection Algorithms for Solving Convex Feasibility Problems
- Nonexpansiveness of a linearized augmented Lagrangian operator for hierarchical convex optimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Convex Analysis
- A logical calculus of the ideas immanent in nervous activity
- Convex analysis and monotone operator theory in Hilbert spaces
- Approximation by superpositions of a sigmoidal function
This page was built for publication: Deep neural network structures solving variational inequalities