A selective linearization method for multiblock convex optimization
From MaRDI portal
Publication:5266536
Abstract: We consider the problem of minimizing a sum of several convex nonsmooth functions. We introduce a new algorithm, the selective linearization method, which iteratively linearizes all but one of the functions and employs simple proximal steps. The algorithm is a form of multiple operator splitting in which the order of processing the partial functions is not fixed but is determined in the course of the calculations. Global convergence is proved and estimates of the convergence rate are derived, including a bound on the number of iterations needed to reach a prescribed solution accuracy. We also illustrate the operation of the algorithm on structured regularization problems.
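The idea described in the abstract can be illustrated for the two-function case. The sketch below is a hypothetical, simplified rendering (not the paper's exact method): for min f1 + f2 with f1 = λ‖x‖₁ and f2 = ½‖Ax − b‖², each pass builds two candidates, one linearizing f2 (a gradient) and taking a proximal step on f1, the other linearizing f1 (a subgradient) and taking a proximal step on f2, then selectively keeps whichever candidate lowers the true objective. The name `slin_sketch` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def slin_sketch(A, b, lam, rho=1.0, iters=200):
    """Schematic selective-linearization loop (illustrative only) for
    min_x f1(x) + f2(x) with f1 = lam*||x||_1, f2 = 0.5*||Ax - b||^2.
    Each pass linearizes one of the two functions at the current best
    point and takes a proximal step on the other."""
    n = A.shape[1]
    x = np.zeros(n)
    f = lambda v: lam * np.abs(v).sum() + 0.5 * np.linalg.norm(A @ v - b) ** 2
    M = A.T @ A + rho * np.eye(n)   # system matrix for the f2 proximal step
    for _ in range(iters):
        # Candidate 1: linearize f2 via its gradient, prox step on f1.
        g2 = A.T @ (A @ x - b)
        z1 = soft_threshold(x - g2 / rho, lam / rho)
        # Candidate 2: linearize f1 via a subgradient, prox step on f2:
        # solve (A'A + rho I) z = A'b - s1 + rho x.
        s1 = lam * np.sign(z1)      # subgradient of lam*||.||_1 at z1
        z2 = np.linalg.solve(M, A.T @ b - s1 + rho * x)
        # Selective choice: keep the candidate with the lower objective,
        # and move the prox center only on sufficient (here: any) decrease.
        z = z1 if f(z1) <= f(z2) else z2
        if f(z) < f(x):
            x = z
    return x
```

By construction the objective is monotonically nonincreasing along the accepted iterates; on a separable problem (A = I) the first ℓ₁ proximal step already lands on the exact soft-thresholding solution.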
Recommendations
- Selective linearization for multi-block statistical learning
- A class of alternating linearization algorithms for nonsmooth convex optimization
- A splitting method for separable convex programming
- Proximal Decomposition Via Alternating Linearization
- Inexact alternating-direction-based contraction methods for separable linearly constrained convex optimization
Cites work
- scientific article; zbMATH DE number 6378171
- scientific article; zbMATH DE number 45081
- scientific article; zbMATH DE number 3574917
- scientific article; zbMATH DE number 477581
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A regularized decomposition method for minimizing a sum of polyhedral functions
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Convex analysis and monotone operator theory in Hilbert spaces
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Iterative construction of the resolvent of a sum of maximal monotone operators
- Methods of descent for nondifferentiable optimization
- Monotone Operators and the Proximal Point Algorithm
- Nonlinear optimization.
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- Proximal Decomposition Via Alternating Linearization
- Proximal splitting methods in signal processing
- Scenarios and Policy Aggregation in Optimization Under Uncertainty
- Split Bregman method for large scale fused Lasso
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The Numerical Solution of Parabolic and Elliptic Differential Equations
- The Split Bregman Method for L1-Regularized Problems
Cited in (6)
- Proximal alternating penalty algorithms for nonsmooth constrained convex optimization
- An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions
- An adaptive partial linearization method for optimization problems on product sets
- An ADMM algorithm for two-stage stochastic programming problems
- Selective linearization for multi-block statistical learning
- An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems
This page was built for publication: A selective linearization method for multiblock convex optimization