A selective linearization method for multiblock convex optimization
Publication:5266536
DOI: 10.1137/15M103217X · zbMATH Open: 1366.90194 · arXiv: 1511.01716 · MaRDI QID: Q5266536
Authors: Yu Du, Xiaodong Lin, Andrzej Ruszczyński
Publication date: 16 June 2017
Published in: SIAM Journal on Optimization
Abstract: We consider the problem of minimizing a sum of several convex non-smooth functions. We introduce a new algorithm called the selective linearization method, which iteratively linearizes all but one of the functions and employs simple proximal steps. The algorithm is a form of multiple operator splitting in which the order of processing partial functions is not fixed, but rather determined in the course of calculations. Global convergence is proved and estimates of the convergence rate are derived. Specifically, the number of iterations needed to achieve solution accuracy $\varepsilon$ is of order $\mathcal{O}(\ln(1/\varepsilon)/\varepsilon)$. We also illustrate the operation of the algorithm on structured regularization problems.
Full work available at URL: https://arxiv.org/abs/1511.01716
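To make the idea in the abstract concrete, here is a toy sketch of a selective-linearization step for the sum of two convex non-smooth functions: at each iteration one block is kept exact and the other is replaced by a subgradient linearization, and the resulting proximal subproblem is solved in closed form. The greedy "pick the candidate with the best objective" selection rule and the specific test function min |x-1| + |x+1| are illustrative assumptions, not the paper's actual acceptance test.

```python
import numpy as np

def prox_abs(v, t, a=0.0):
    # Proximal operator of t*|x - a| evaluated at v (shifted soft-thresholding).
    return a + np.sign(v - a) * max(abs(v - a) - t, 0.0)

def selective_linearization(z, rho=1.0, iters=50):
    """Toy sketch for min |x-1| + |x+1| (any point in [-1, 1] is optimal,
    with optimal value 2). Illustrative only; the paper's selection and
    stopping tests are more elaborate."""
    f = [lambda x: abs(x - 1.0), lambda x: abs(x + 1.0)]
    subgrad = [lambda x: np.sign(x - 1.0), lambda x: np.sign(x + 1.0)]
    prox = [lambda v, t: prox_abs(v, t, 1.0), lambda v, t: prox_abs(v, t, -1.0)]
    for _ in range(iters):
        candidates = []
        for k in range(2):           # keep block k exact, linearize the other
            j = 1 - k
            g = subgrad[j](z)        # subgradient of the linearized block
            # Solve min_x f_k(x) + g*x + (rho/2)(x - z)^2, i.e. the prox of
            # f_k at the shifted point z - g/rho.
            x = prox[k](z - g / rho, 1.0 / rho)
            candidates.append((f[0](x) + f[1](x), x))
        val, x = min(candidates)     # hypothetical greedy selection rule
        if f[0](z) + f[1](z) - val < 1e-12:
            break                    # no further decrease
        z = x
    return z
```

Starting from `z = 3.0`, the sketch reaches a minimizer of the two-term objective in a few iterations; which block gets linearized is decided per iteration, mirroring the adaptive processing order described in the abstract.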
Recommendations
- Selective linearization for multi-block statistical learning
- A class of alternating linearization algorithms for nonsmooth convex optimization
- A splitting method for separable convex programming
- Proximal Decomposition Via Alternating Linearization
- Inexact alternating-direction-based contraction methods for separable linearly constrained convex optimization
Classification: Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Nonlinear programming (90C30); Decomposition methods (49M27)
Cites Work
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- On the Numerical Solution of Heat Conduction Problems in Two and Three Space Variables
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Convex analysis and monotone operator theory in Hilbert spaces
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- The Split Bregman Method for L1-Regularized Problems
- Split Bregman method for large scale fused Lasso
- Proximal splitting methods in signal processing
- Methods of descent for nondifferentiable optimization
- Nonlinear optimization.
- A regularized decomposition method for minimizing a sum of polyhedral functions
- Monotone Operators and the Proximal Point Algorithm
- A splitting algorithm for dual monotone inclusions involving cocoercive operators
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- Iterative construction of the resolvent of a sum of maximal monotone operators
- The Numerical Solution of Parabolic and Elliptic Differential Equations
- Scenarios and Policy Aggregation in Optimization Under Uncertainty
- Proximal Decomposition Via Alternating Linearization
Cited In (6)
- Proximal alternating penalty algorithms for nonsmooth constrained convex optimization
- Selective linearization for multi-block statistical learning
- An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems
- An ADMM algorithm for two-stage stochastic programming problems
- An adaptive partial linearization method for optimization problems on product sets
- An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions