An \(O(s^r)\)-resolution ODE framework for understanding discrete-time algorithms and applications to the linear convergence of minimax problems
Publication: 2149577
DOI: 10.1007/s10107-021-01669-4
zbMath: 1502.90137
arXiv: 2001.08826
OpenAlex: W3173451675
MaRDI QID: Q2149577
Publication date: 29 June 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/2001.08826
Analysis of algorithms and problem complexity (68Q25)
Minimax problems in mathematical programming (90C47)
Nonconvex programming, global optimization (90C26)
Related Items (3)
- A \(J\)-symmetric quasi-Newton method for minimax problems
- A second order primal-dual dynamical system for a convex-concave bilinear saddle point problem
- Unnamed Item
Cites Work
- Smooth minimization of non-smooth functions
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- The statistical mechanics of strategic interaction
- Introductory lectures on convex optimization. A basic course.
- On linear convergence of iterative methods for the variational inequality problem
- Optimization and dynamical systems
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- Theory and Applications of Robust Optimization
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Monotone Operators and the Proximal Point Algorithm
- A dynamical systems approach to constrained minimization
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- A variational perspective on accelerated methods in optimization
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Convergence Rate of $\mathcal{O}(1/k)$ for Optimistic Gradient and Extragradient Methods in Smooth Convex-Concave Saddle Point Problems
- Convex analysis and monotone operator theory in Hilbert spaces