Fast Multiple-Splitting Algorithms for Convex Optimization

From MaRDI portal

Publication:2910883

DOI: 10.1137/090780705
zbMath: 1254.65075
arXiv: 0912.4570
OpenAlex: W1966881087
MaRDI QID: Q2910883

Donald Goldfarb, Shi-Qian Ma

Publication date: 12 September 2012

Published in: SIAM Journal on Optimization

Abstract: We present in this paper two different classes of general $K$-splitting algorithms for solving finite-dimensional convex optimization problems. Under the assumption that the function being minimized has a Lipschitz continuous gradient, we prove that the number of iterations needed by the first class of algorithms to obtain an $\epsilon$-optimal solution is $O(1/\epsilon)$. The algorithms in the second class are accelerated versions of those in the first class, where the complexity result is improved to $O(1/\sqrt{\epsilon})$ while the computational effort required at each iteration is almost unchanged. To the best of our knowledge, the complexity results presented in this paper are the first ones of this type that have been given for splitting and alternating direction type methods. Moreover, all algorithms proposed in this paper are parallelizable, which makes them particularly attractive for solving certain large-scale problems.
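To illustrate the complexity gap the abstract refers to, here is a minimal sketch (not the paper's $K$-splitting algorithms themselves, but the generic Nesterov/FISTA-type acceleration they build on) comparing an unaccelerated gradient scheme, with $O(1/\epsilon)$ iteration complexity, against its accelerated counterpart, with $O(1/\sqrt{\epsilon})$, on a toy smooth convex quadratic. All function and variable names below are our own illustrative choices.

```python
# Toy problem: f(x) = 0.5*x1^2 + 50*x2^2, minimizer at the origin, f* = 0.
# grad f is Lipschitz with constant L = 100.
L = 100.0

def f(x):
    return 0.5 * x[0] ** 2 + 50.0 * x[1] ** 2

def grad_f(x):
    return [x[0], 100.0 * x[1]]

def plain_gradient(x0, iters):
    # Unaccelerated gradient method with step 1/L:
    # needs O(1/eps) iterations for an eps-optimal point.
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        x = [xi - gi / L for xi, gi in zip(x, g)]
    return x

def accelerated_gradient(x0, iters):
    # Nesterov-accelerated variant: O(1/sqrt(eps)) iterations,
    # at essentially the same per-iteration cost (one gradient
    # evaluation plus a cheap extrapolation step).
    x, y, t = list(x0), list(x0), 1.0
    for _ in range(iters):
        g = grad_f(y)
        x_new = [yi - gi / L for yi, gi in zip(y, g)]
        t_new = 0.5 * (1.0 + (1.0 + 4.0 * t * t) ** 0.5)
        # momentum/extrapolation step
        y = [xn + ((t - 1.0) / t_new) * (xn - xo)
             for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
    return x

x0 = [1.0, 1.0]
print("plain f after 50 iters:      ", f(plain_gradient(x0, 50)))
print("accelerated f after 50 iters:", f(accelerated_gradient(x0, 50)))
```

On this badly conditioned quadratic the accelerated scheme reaches a much smaller objective value in the same number of iterations; the standard guarantee for the accelerated variant is $f(x_k) - f^* \le 2L\|x_0 - x^*\|^2/(k+1)^2$, which is the source of the $O(1/\sqrt{\epsilon})$ complexity.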


Full work available at URL: https://arxiv.org/abs/0912.4570






Related Items (31)

A fast dual proximal-gradient method for separable convex optimization with linear coupled constraints
An augmented Lagrangian based parallel splitting method for separable convex minimization with applications to image processing
Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems
On the global and linear convergence of the generalized alternating direction method of multipliers
Alternating proximal gradient method for convex minimization
Accelerated Bregman operator splitting with backtracking
Fast alternating linearization methods for minimizing the sum of two convex functions
iPiasco: inertial proximal algorithm for strongly convex optimization
An operator-splitting approach for variational optimal control formulations for diffeomorphic shape matching
Inexact alternating direction methods of multipliers for separable convex optimization
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Alternating direction augmented Lagrangian methods for semidefinite programming
Decomposable Markov Decision Processes: A Fluid Optimization Approach
A linear algebra perspective on the random multi-block ADMM: the QP case
Accelerated linearized Bregman method
A survey on operator splitting and decomposition of convex programs
Alternating Direction Methods for Latent Variable Gaussian Graphical Model Selection
Alternating direction method of multipliers for sparse principal component analysis
Weighted-average alternating minimization method for magnetic resonance image reconstruction based on compressive sensing
Multi-block nonconvex nonsmooth proximal ADMM: convergence and rates under Kurdyka-Łojasiewicz property
An alternating direction method of multipliers for tensor complementarity problems
On the linear convergence of the alternating direction method of multipliers
An introduction to continuous optimization for imaging
Convergence rates for an inexact ADMM applied to separable convex optimization
Diffeomorphic shape matching by operator splitting in 3D cardiology imaging
Fast inexact decomposition algorithms for large-scale separable convex optimization
Dualize, split, randomize: toward fast nonsmooth optimization algorithms
On the Global Linear Convergence of the ADMM with MultiBlock Variables
Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization
Mirror Prox algorithm for multi-term composite minimization and semi-separable problems
Linearized alternating direction method with parallel splitting and adaptive penalty for separable convex programs in machine learning





This page was built for publication: Fast Multiple-Splitting Algorithms for Convex Optimization