Fast primal-dual gradient method for strongly convex minimization problems with linear constraints
Publication: 3133230
DOI: 10.1007/978-3-319-44914-2_31 · zbMATH Open: 1391.90471 · arXiv: 1605.02970 · OpenAlex: W2963043933 · MaRDI QID: Q3133230 · FDO: Q3133230
Authors: A. V. Chernov, Pavel Dvurechensky, Alexander V. Gasnikov
Publication date: 13 February 2018
Published in: Discrete Optimization and Operations Research
Abstract: In this paper we consider a class of optimization problems with a strongly convex objective function and a feasible set given by the intersection of a simple convex set with a set defined by a number of linear equality and inequality constraints. Many applied optimization problems can be stated in this form, examples being entropy-linear programming, ridge regression, the elastic net, regularized optimal transport, etc. We extend the Fast Gradient Method applied to the dual problem so that it becomes primal-dual: it not only solves the dual problem but also constructs a nearly optimal and nearly feasible solution of the primal problem. We also prove a theorem on the convergence rate of the proposed algorithm in terms of the objective function residual and the infeasibility of the linear constraints.
Full work available at URL: https://arxiv.org/abs/1605.02970
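As a rough illustration of the approach summarized in the abstract, the sketch below runs Nesterov's fast gradient method on the dual of a strongly convex problem with linear equality constraints and averages the primal minimizers computed along the way to obtain a nearly optimal, nearly feasible primal point. It is a minimal sketch, not the paper's exact algorithm: the quadratic objective f(x) = ½‖x − c‖², the momentum schedule k/(k+3), the averaging weights (k+1)/2, and the absence of the paper's stopping criterion are all simplifying assumptions made here for illustration.

```python
# Minimal sketch (assumptions noted above): accelerated gradient ascent on the
# dual of   min_x 0.5*||x - c||^2   s.t.  A x = b,
# with a weighted average of the primal minimizers x(y_k) serving as the
# approximate primal solution, in the spirit of the primal-dual scheme.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 10
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)

mu = 1.0                                   # strong convexity of f(x) = 0.5*||x - c||^2
L = np.linalg.norm(A, 2) ** 2 / mu         # Lipschitz constant of the dual gradient

def x_of(lam):
    # argmin_x { 0.5*||x - c||^2 + lam^T (A x - b) }, closed form for this f
    return c - A.T @ lam

lam = np.zeros(m)          # dual iterate
y = lam.copy()             # extrapolated (momentum) point
x_hat = np.zeros(n)        # weighted average of primal minimizers
w_sum = 0.0                # accumulated averaging weights

N = 2000
for k in range(N):
    xk = x_of(y)
    grad = A @ xk - b                      # gradient of the concave dual at y
    lam_new = y + grad / L                 # dual ascent step with step size 1/L
    y = lam_new + k / (k + 3) * (lam_new - lam)
    lam = lam_new

    w = (k + 1) / 2.0                      # averaging weight (an assumption here)
    w_sum += w
    x_hat += w * (xk - x_hat) / w_sum      # running weighted average of x(y_k)

# Exact solution via the KKT system, for comparison only.
K = np.block([[np.eye(n), A.T], [A, np.zeros((m, m))]])
x_star = np.linalg.solve(K, np.concatenate([c, b]))[:n]

print("constraint violation ||A x_hat - b|| =", np.linalg.norm(A @ x_hat - b))
print("distance to optimum  ||x_hat - x_star|| =", np.linalg.norm(x_hat - x_star))
```

In this toy setting the dual gradient is available in closed form; for other objectives in the class (e.g., entropy-regularized problems) the inner minimizer x(λ) would be computed by the corresponding explicit formula or an inner solver.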
Recommendations
- A fast dual proximal gradient algorithm for convex minimization and applications
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates
Cited In (22)
- Inexact model: a framework for optimization and variational inequalities
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs
- First-order methods for convex optimization
- Universal intermediate gradient method for convex problems with inexact oracle
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
- Numerical methods for the resource allocation problem in a computer network
- Accelerated dual-averaging primal–dual method for composite convex minimization
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- An accelerated directional derivative method for smooth stochastic convex optimization
- Fast proximity-gradient algorithms for structured convex optimization problems
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Stochastic saddle-point optimization for the Wasserstein barycenter problem
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- A fast dual proximal gradient algorithm for convex minimization and applications
- A dual approach for optimal algorithms in distributed optimization over networks
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- On numerical estimates of errors in solving convex optimization problems
- Composite optimization for the resource allocation problem
- Alternating minimization methods for strongly convex optimization
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems