Primal-dual block-proximal splitting for a class of non-convex problems
From MaRDI portal
Publication: 2218923
DOI: 10.1553/etna_vol52s509
zbMath: 1457.90156
arXiv: 1911.06284
OpenAlex: W2985873675
MaRDI QID: Q2218923
Stanislav Mazurenko, Jyrki Jauhiainen, Tuomo Valkonen
Publication date: 18 January 2021
Published in: ETNA. Electronic Transactions on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1911.06284
Mathematics Subject Classification: Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Programming in abstract spaces (90C48)
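The publication recorded above extends primal-dual proximal splitting (the PDHG/Chambolle-Pock scheme of the cited "A first-order primal-dual algorithm for convex problems with applications to imaging") to block-proximal updates for a class of non-convex problems. As background only, here is a minimal sketch of the underlying convex primal-dual iteration for min_x F(Kx) + G(x), instantiated for a lasso-type problem; the function name, step-size choices, and test problem are illustrative assumptions, not code from the publication (Python is used here for accessibility, although the related software BlockPDPS.jl is written in Julia):

```python
import numpy as np

def pdhg_lasso(A, b, lam, iters=5000):
    """Illustrative Chambolle-Pock (PDHG) iteration for the convex problem
        min_x 0.5*||A x - b||^2 + lam*||x||_1,
    written as min_x F(K x) + G(x) with K = A,
    F(y) = 0.5*||y - b||^2 and G(x) = lam*||x||_1."""
    m, n = A.shape
    L = np.linalg.norm(A, 2)       # operator norm of K = A
    tau = sigma = 0.99 / L         # step sizes satisfying sigma*tau*L^2 < 1
    x = np.zeros(n)
    xbar = x.copy()
    y = np.zeros(m)
    for _ in range(iters):
        # dual step: prox of sigma*F*, where F*(v) = 0.5*||v||^2 + <v, b>,
        # gives the closed form (v - sigma*b) / (1 + sigma)
        y = (y + sigma * (A @ xbar) - sigma * b) / (1.0 + sigma)
        # primal step: prox of tau*G is soft-thresholding at tau*lam
        x_new = x - tau * (A.T @ y)
        x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0.0)
        # extrapolation ("bar") step used by PDHG
        xbar = 2.0 * x_new - x
        x = x_new
    return x
```

The publication's block-proximal, non-convex variant replaces these global updates with per-block steps and spatially adapted step sizes; this sketch only shows the convex baseline iteration that those methods generalize.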
Related Items (2)
- An alternative extrapolation scheme of PDHGM for saddle point problem with nonlinear function
- BlockPDPS.jl
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- An algorithm for total variation minimization and applications
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Testing and non-linear preconditioning of the proximal point method
- Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization
- Block-proximal methods with spatially adapted acceleration
- Coordinate descent algorithms
- Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
- Distributed Coordinate Descent Method for Learning with Big Data
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- TGV for diffusion tensors: A comparison of fidelity functions
- Total Generalized Variation in Diffusion Tensor Imaging
- A Two-Stage Image Segmentation Method for Blurry Images with Poisson or Multiplicative Gamma Noise
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Julia: A Fresh Approach to Numerical Computing
- Accelerated, Parallel, and Proximal Coordinate Descent
- Variational Analysis
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Acceleration and Global Convergence of a First-Order Primal-Dual Method for Nonconvex Problems
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- A primal–dual hybrid gradient method for nonlinear operators with applications to MRI
- Relaxed Gauss--Newton Methods with Applications to Electrical Impedance Tomography
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- Primal-Dual Extragradient Methods for Nonlinear Nonsmooth PDE-Constrained Optimization
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization