Two steps at a time - taking GAN training in stride with Tseng's method
DOI: 10.1137/21M1420939 · zbMATH Open: 1492.65175 · arXiv: 2006.09033 · OpenAlex: W3035573799 · MaRDI QID: Q5089720 · FDO: Q5089720
Axel Böhm, Ernö Robert Csetnek, Michael Sedlmayer, Radu I. Boţ
Publication date: 15 July 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2006.09033
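For orientation, the method of the title is Tseng's forward-backward-forward (FBF) splitting, which appears among the cited works ("A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity"). The following is a minimal sketch, not the paper's published algorithm: it applies the unconstrained FBF step to a toy bilinear saddle-point problem min_x max_y xy, whose plain gradient descent-ascent dynamics diverge. The operator, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

# Toy monotone operator for the bilinear saddle point min_x max_y x*y:
# F(x, y) = (dL/dx, -dL/dy) = (y, -x). F is monotone and 1-Lipschitz.
def F(z):
    x, y = z
    return np.array([y, -x])

def tseng_fbf(z, step=0.5, iters=200):
    """Unconstrained forward-backward-forward (Tseng) iteration.

    z_half = z - step * F(z)                       (forward step)
    z_new  = z_half - step * (F(z_half) - F(z))    (correcting forward step)

    With no backward (proximal) term this reduces to an extragradient-type
    update; it converges for monotone L-Lipschitz F when step < 1/L.
    """
    for _ in range(iters):
        Fz = F(z)
        z_half = z - step * Fz
        z = z_half - step * (F(z_half) - Fz)
    return z

# Starting from (1, 1), the iterates spiral into the saddle point (0, 0).
z_star = tseng_fbf(np.array([1.0, 1.0]))
```

Plain simultaneous gradient descent-ascent on this problem rotates outward; the extra evaluation of F at the intermediate point z_half is what damps the rotation.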
Recommendations
- Alternating Proximal-Gradient Steps for (Stochastic) Nonconvex-Concave Minimax Problems
- Optimality Conditions for Nonsmooth Nonconvex-Nonconcave Min-Max Problems and Generative Adversarial Networks
- Training GANs with centripetal acceleration
- Efficient second-order optimization with predictions in differential games
- Convergence rate of \(\mathcal{O}(1/k)\) for optimistic gradient and extragradient methods in smooth convex-concave saddle point problems
Classifications
- Stochastic programming (90C15)
- Minimax problems in mathematical programming (90C47)
- Numerical methods for variational inequalities and related problems (65K15)
Cites Work
- Nonlinear total variation based noise removal algorithms
- Title not available
- Convex analysis and monotone operator theory in Hilbert spaces
- A Stochastic Approximation Method
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Title not available
- Dual extrapolation and its applications to solving variational inequalities and related problems
- Solving variational inequalities with Stochastic Mirror-Prox algorithm
- Convex Optimization in Signal Processing and Communications
- A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity
- ℓ1 Regularization in Infinite Dimensional Feature Spaces
- Projected Reflected Gradient Methods for Monotone Variational Inequalities
- Shadow Douglas-Rachford splitting for monotone inclusions
- Breaking the Curse of Dimensionality with Convex Neural Networks
- The complexity of constrained min-max optimization
- Extragradient Method with Variance Reduction for Stochastic Variational Inequalities
- Convergence Rate of $\mathcal{O}(1/k)$ for Optimistic Gradient and Extragradient Methods in Smooth Convex-Concave Saddle Point Problems
- Minibatch Forward-Backward-Forward Methods for Solving Stochastic Variational Inequalities
- A Primal-Dual Algorithm with Line Search for General Convex-Concave Saddle Point Problems
Cited In (10)
- Optimality Conditions for Nonsmooth Nonconvex-Nonconcave Min-Max Problems and Generative Adversarial Networks
- An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function
- Tseng’s Algorithm with Extrapolation from the past Endowed with Variable Metrics and Error Terms
- Alternating Proximal-Gradient Steps for (Stochastic) Nonconvex-Concave Minimax Problems
- A modified Tseng's algorithm with extrapolation from the past for pseudo-monotone variational inequalities
- A mirror inertial forward-reflected-backward splitting: convergence analysis beyond convexity and Lipschitz smoothness
- Variable sample-size optimistic mirror descent algorithm for stochastic mixed variational inequalities
- Efficient second-order optimization with predictions in differential games
- Fast convergence of the primal-dual dynamical system and corresponding algorithms for a nonsmooth bilinearly coupled saddle point problem
- Variable sample-size operator extrapolation algorithm for stochastic mixed variational inequalities