Large-scale convex optimization. Algorithms \& analyses via monotone operators
From MaRDI portal
Publication:5097614
DOI: 10.1017/9781009160865
OpenAlex: W4311114086
MaRDI QID: Q5097614
FDO: Q5097614
Authors: Ernest K. Ryu, Wotao Yin
Publication date: 25 August 2022
Full work available at URL: https://doi.org/10.1017/9781009160865
Cited In (13)
- Convex optimization: algorithms and complexity
- First-order methods in optimization
- Learning to optimize: a tutorial for continuous and mixed-integer optimization
- Port-Hamiltonian Systems Theory and Monotonicity
- Optimal error bounds for non-expansive fixed-point iterations in normed spaces
- An Efficient and Robust Scalar Auxiliary Variable Based Algorithm for Discrete Gradient Systems Arising from Optimizations
- Maximal Monotonicity and Cyclic Involutivity of Multiconjugate Convex Functions
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Splitting algorithms, modern operator theory, and applications. Based on the workshop on splitting algorithms, modern operator theory, and applications, Oaxaca, Mexico, September 17--22, 2017. Dedicated to the memory of Jonathan M. Borwein
- Algorithms with gradient clipping for stochastic optimization with heavy-tailed noise
- Algorithms for convex optimization
- A primer on monotone operator methods
- A generalized forward-backward splitting operator: degenerate analysis and applications