Primal-dual block-proximal splitting for a class of non-convex problems
From MaRDI portal
Publication:2218923
Abstract: We develop block structure adapted primal-dual algorithms for non-convex non-smooth optimisation problems whose objectives can be written as compositions of non-smooth block-separable convex functions with a non-linear Lipschitz-differentiable operator. Our methods are refinements of the non-linear primal-dual proximal splitting method for such problems without the block structure, which itself is based on the primal-dual proximal splitting method of Chambolle and Pock for convex problems. We propose individual step length parameters and acceleration rules for each of the primal and dual blocks of the problem. This allows the methods to converge faster by adapting to the structure of the problem. For the squared distance of the iterates to a critical point, we show local $O(1/N)$, $O(1/N^2)$, and linear rates under varying conditions and choices of the step length parameters. Finally, we demonstrate the performance of the methods on practical inverse problems: diffusion tensor imaging and electrical impedance tomography.
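The underlying Chambolle–Pock primal-dual proximal splitting iteration that the paper's block-adapted methods refine can be sketched on a toy convex problem. The sketch below applies it to one-dimensional total-variation denoising, min_x ½‖x − b‖² + λ‖Dx‖₁ with D the forward-difference operator; the function name, step lengths, and problem instance are illustrative choices, not taken from the paper (which treats non-linear operators and per-block step lengths).

```python
import numpy as np

def pdps_tv1d(b, lam=0.5, tau=0.25, sigma=0.25, theta=1.0, iters=2000):
    """Chambolle-Pock primal-dual proximal splitting for 1-D TV denoising:
        min_x 0.5*||x - b||^2 + lam * ||D x||_1,  D = forward differences.
    Illustrative sketch only; the paper refines this scheme with individual
    step lengths per primal/dual block and a non-linear operator.
    Step lengths satisfy tau*sigma*||D||^2 <= 1 since ||D||^2 <= 4."""
    n = len(b)
    x = b.copy()
    xbar = x.copy()
    y = np.zeros(n - 1)               # dual variable on the differences
    for _ in range(iters):
        # Dual ascent: prox of the conjugate of lam*||.||_1 is a clip.
        y = np.clip(y + sigma * np.diff(xbar), -lam, lam)
        # Primal descent: prox of 0.5*||x - b||^2 after a step along -D^T y.
        x_old = x
        dty = np.zeros(n)             # D^T y (negative discrete divergence)
        dty[:-1] -= y
        dty[1:] += y
        x = (x - tau * dty + tau * b) / (1.0 + tau)
        # Over-relaxation of the primal iterate.
        xbar = x + theta * (x - x_old)
    return x
```

Running it on a noisy step signal recovers a nearly piecewise-constant estimate whose total variation is well below that of the data, the expected behaviour of the convex baseline method.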
Recommendations
- Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization
- Acceleration and global convergence of a first-order primal-dual method for nonconvex problems
- Local linear convergence analysis of primal-dual splitting methods
- The Primal-Dual Hybrid Gradient Method for Semiconvex Splittings
- Block-proximal methods with spatially adapted acceleration
Cites work
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- A primal-dual hybrid gradient method for nonlinear operators with applications to MRI
- A stochastic semismooth Newton method for nonsmooth nonconvex optimization
- A two-stage image segmentation method for blurry images with Poisson or multiplicative gamma noise
- ARock: an algorithmic framework for asynchronous parallel coordinate updates
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Accelerated, parallel, and proximal coordinate descent
- Acceleration and global convergence of a first-order primal-dual method for nonconvex problems
- An algorithm for total variation minimization and applications
- Block stochastic gradient iteration for convex and nonconvex optimization
- Block-proximal methods with spatially adapted acceleration
- Coordinate descent algorithms
- Distributed coordinate descent method for learning with big data
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Iterative Hessian sketch: fast and accurate solution approximation for constrained least-squares
- Julia: a fresh approach to numerical computing
- Parallel coordinate descent methods for big data optimization
- Primal-dual extragradient methods for nonlinear nonsmooth PDE-constrained optimization
- Primal-dual proximal splitting and generalized conjugation in non-smooth non-convex optimization
- Relaxed Gauss-Newton methods with applications to electrical impedance tomography
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Stochastic model-based minimization of weakly convex functions
- Stochastic primal-dual coordinate method for regularized empirical risk minimization
- TGV for diffusion tensors: a comparison of fidelity functions
- Testing and non-linear preconditioning of the proximal point method
- Total generalized variation in diffusion tensor imaging
- Variational Analysis
Cited in (4)