A smooth primal-dual optimization framework for nonsmooth composite convex minimization
DOI: 10.1137/16M1093094 · zbMATH Open: 1386.90109 · arXiv: 1507.06243 · OpenAlex: W2962713896 · MaRDI QID: Q4600841 · FDO: Q4600841
Authors: Quoc Tran Dinh, Olivier Fercoq, Volkan Cevher
Publication date: 17 January 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1507.06243
Recommendations
- An adaptive primal-dual framework for nonsmooth convex minimization
- Adaptive smoothing algorithms for nonsmooth composite convex minimization
- Smoothing and first order methods: a unified framework
- Dualize, split, randomize: toward fast nonsmooth optimization algorithms
- Excessive Gap Technique in Nonsmooth Convex Minimization
Keywords: augmented Lagrangian; homotopy; separable convex minimization; smoothing techniques; parallel and distributed computation; gap reduction technique; first-order primal-dual methods
MSC: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Computational methods for problems pertaining to operations research and mathematical programming (90-08)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Variational Analysis
- Title not available
- Convex analysis and monotone operator theory in Hilbert spaces
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- On the global and linear convergence of the generalized alternating direction method of multipliers
- On the \(O(1/n)\) convergence rate of the Douglas-Rachford alternating direction method
- Lectures on modern convex optimization. Analysis, algorithms, and engineering applications
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Convex Analysis
- Title not available
- The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Proximal splitting methods in signal processing
- A proximal-based decomposition method for convex minimization problems
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- A first-order primal-dual algorithm for convex problems with applications to imaging
- On the complexity of the hybrid proximal extragradient method for the iterates and the ergodic mean
- Complexity of Variants of Tseng's Modified F-B Splitting and Korpelevich's Methods for Hemivariational Inequalities with Applications to Saddle-point and Convex Optimization Problems
- Title not available
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Application of a Smoothing Technique to Decomposition in Convex Optimization
- Computational complexity of inexact gradient augmented Lagrangian methods: application to constrained MPC
- Iteration-complexity of block-decomposition algorithms and the alternating direction method of multipliers
- Iteration-complexity of first-order penalty methods for convex programming
- Convergence analysis of primal-dual algorithms for a saddle-point problem: from contraction perspective
- A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
- Title not available
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- Dual extrapolation and its applications to solving variational inequalities and related problems
- A Variable Metric Extension of the Forward–Backward–Forward Algorithm for Monotone Operators
- The convex geometry of linear inverse problems
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- Parallel multi-block ADMM with \(o(1/k)\) convergence
- On non-ergodic convergence rate of Douglas-Rachford alternating direction method of multipliers
- Solving Multiple-Block Separable Convex Minimization Problems Using Two-Block Alternating Direction Method of Multipliers
- Smoothing and first order methods: a unified framework
- Optimal primal-dual methods for a class of saddle point problems
- A variable smoothing algorithm for solving convex optimization problems
- On the Global Linear Convergence of the ADMM with MultiBlock Variables
- On the sublinear convergence rate of multi-block ADMM
- A first-order primal-dual algorithm with linesearch
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming
- Composite self-concordant minimization
- Iteration complexity analysis of dual first-order methods for conic convex programming
- Convergence rate analysis of the forward-Douglas-Rachford splitting scheme
- Convergence Rate Analysis of Primal-Dual Splitting Schemes
- An accelerated linearized alternating direction method of multipliers
- An accelerated HPE-type algorithm for a class of composite convex-concave saddle-point problems
- Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions
- A fast dual proximal gradient algorithm for convex minimization and applications
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Smoothing alternating direction methods for fully nonsmooth constrained convex optimization
Cited in (29)
- New primal-dual algorithms for a class of nonsmooth and nonlinear convex-concave minimax problems
- Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
- A generic coordinate descent solver for non-smooth convex optimisation
- Accelerated primal-dual methods with adaptive parameters for composite convex optimization with linear constraints
- First-order methods for convex optimization
- A primal-dual flow for affine constrained convex optimization
- A coordinate-descent primal-dual algorithm with large step size and possibly nonseparable functions
- Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates
- Variable smoothing for convex optimization problems using stochastic gradients
- A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm
- Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
- A new homotopy proximal variable-metric framework for composite convex minimization
- Proximal alternating penalty algorithms for nonsmooth constrained convex optimization
- Accelerated dual-averaging primal–dual method for composite convex minimization
- Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
- First-order primal-dual methods for nonsmooth non-convex optimization
- An adaptive primal-dual framework for nonsmooth convex minimization
- Variable smoothing for weakly convex composite functions
- The operator splitting schemes revisited: primal-dual gap and degeneracy reduction by a unified analysis
- Random minibatch subgradient algorithms for convex problems with functional constraints
- On the convergence of stochastic primal-dual hybrid gradient
- An efficient primal dual prox method for non-smooth optimization
- Non-ergodic convergence rate of an inertial accelerated primal-dual algorithm for saddle point problems
- A dual approach for optimal algorithms in distributed optimization over networks
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates
- A primal-dual smoothing framework for max-structured non-convex optimization
- A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates