A flexible ADMM algorithm for big data applications
From MaRDI portal
Classification (MSC)
- 65K05 Numerical mathematical programming methods
- 65K10 Numerical optimization and variational techniques
- 65Y20 Complexity and performance of numerical algorithms
- 68Q25 Analysis of algorithms and problem complexity
- 90C30 Nonlinear programming
- 49M37 Numerical methods based on nonlinear programming
- 90C60 Abstract computational complexity for mathematical programming problems
- 49M15 Newton-type methods
- 58C15 Implicit function theorems; global Newton methods on manifolds
Abstract: We present a flexible Alternating Direction Method of Multipliers (F-ADMM) algorithm for solving optimization problems involving a strongly convex objective function that is separable into blocks, subject to (non-separable) linear equality constraints. The F-ADMM algorithm uses a Gauss-Seidel scheme to update blocks of variables, and a regularization term is added to each of the subproblems arising within F-ADMM. We prove, under common assumptions, that F-ADMM is globally convergent. We also present a special case of F-ADMM that is partially parallelizable, which makes it attractive in a big data setting. In particular, we partition the data into groups, so that each group consists of multiple blocks of variables. By applying F-ADMM to this partitioning of the data, and using a specific regularization matrix, we obtain a hybrid ADMM (H-ADMM) algorithm: the grouped data is updated in a Gauss-Seidel fashion, and the blocks within each group are updated in a Jacobi manner. Convergence of H-ADMM follows directly from the convergence properties of F-ADMM. Also, a special case of H-ADMM can be applied to functions that are convex, rather than strongly convex. We present numerical experiments to demonstrate the practical advantages of this algorithm.
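The regularized Gauss-Seidel update described in the abstract can be sketched on a toy instance: strongly convex quadratic blocks \(f_i(x_i) = \tfrac12\|x_i - c_i\|^2\) coupled by \(\sum_i A_i x_i = b\). The quadratic objective, the penalty parameter `rho`, and the choice of regularization matrix \(P_i = \mu I\) are illustrative assumptions made here for a closed-form subproblem, not the paper's actual experiments.

```python
import numpy as np

def gs_regularized_admm(A_blocks, c_blocks, b, rho=1.0, mu=0.5, iters=2000):
    """Sketch of a Gauss-Seidel, per-block-regularized ADMM sweep
    (in the spirit of F-ADMM) applied to
        min  sum_i 0.5*||x_i - c_i||^2   s.t.  sum_i A_i x_i = b.
    Each subproblem carries an extra proximal term (mu/2)*||x_i - x_i^k||^2,
    i.e. the regularization matrix is taken as mu*I for simplicity."""
    n_blocks = len(A_blocks)
    x = [np.zeros_like(c) for c in c_blocks]
    u = np.zeros_like(b)  # scaled dual variable
    for _ in range(iters):
        for i in range(n_blocks):
            # Gauss-Seidel: blocks j != i enter at their latest values.
            r = sum(A_blocks[j] @ x[j] for j in range(n_blocks) if j != i) - b + u
            Ai, ci = A_blocks[i], c_blocks[i]
            # Regularized subproblem is quadratic, so it has a closed form:
            # ((1+mu)I + rho*Ai^T Ai) x_i = ci + mu*x_i^k - rho*Ai^T r
            H = (1.0 + mu) * np.eye(ci.size) + rho * Ai.T @ Ai
            g = ci + mu * x[i] - rho * Ai.T @ r
            x[i] = np.linalg.solve(H, g)
        # Scaled dual ascent on the (non-separable) equality constraint.
        u = u + (sum(A_blocks[j] @ x[j] for j in range(n_blocks)) - b)
    return x, u
```

For this strongly convex instance the iterates approach the projection of \(c\) onto the affine set \(\{x : Ax = b\}\), which can be checked against the analytic solution \(x^\star = c - A^\top (A A^\top)^{-1}(Ac - b)\). In the H-ADMM variant, the inner loop over blocks within a group would instead use the previous sweep's values throughout (a Jacobi update), making it parallelizable.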
Recommendations
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- Generalized symmetric ADMM for separable convex optimization
- An inexact accelerated stochastic ADMM for separable convex optimization
- Understanding the convergence of the alternating direction method of multipliers: theoretical and computational perspectives
- Implementing the alternating direction method of multipliers for big datasets: a case study of least absolute shrinkage and selection operator
Cites work
- Scientific article (zbMATH DE number 3574917; no title available)
- Scientific article (zbMATH DE number 6142618; no title available)
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- A note on the alternating direction method of multipliers
- A parallel line search subspace correction method for composite convex optimization
- Alternating direction algorithms for \(\ell_1\)-problems in compressive sensing
- Augmented \(\ell_1\) and nuclear-norm models with a globally linearly convergent algorithm
- Block-wise ADMM with a relaxation factor for multiple-block convex programming
- Block-wise alternating direction method of multipliers for multiple-block convex programming and beyond
- Convergence analysis of alternating direction method of multipliers for a family of nonconvex problems
- Convex Analysis
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Extended monotropic programming and duality
- Local linear convergence of the alternating direction method of multipliers on quadratic or linear programs
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the Global Linear Convergence of the ADMM with Multiblock Variables
- On the global and linear convergence of the generalized alternating direction method of multipliers
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent
- Understanding the convergence of the alternating direction method of multipliers: theoretical and computational perspectives
- Variational Analysis
Cited in (9)
- Implementing the alternating direction method of multipliers for big datasets: a case study of least absolute shrinkage and selection operator
- Hybrid MPI/OpenMP parallel asynchronous distributed alternating direction method of multipliers
- First-order algorithms for convex optimization with nonseparable objective and coupled constraints
- Generalized symmetric ADMM for separable convex optimization
- An inexact accelerated stochastic ADMM for separable convex optimization
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- An ADMM algorithm for two-stage stochastic programming problems
- Poisson noise removal based on non-convex hybrid regularizers
- A framework for parallel second order incremental optimization algorithms for solving partially separable problems
MaRDI item Q1704789