Fast primal-dual gradient method for strongly convex minimization problems with linear constraints
From MaRDI portal
Publication:3133230
Abstract: In this paper we consider a class of optimization problems with a strongly convex objective function and a feasible set given by the intersection of a simple convex set with a set defined by a number of linear equality and inequality constraints. Many applied optimization problems can be stated in this form, examples being entropy-linear programming, ridge regression, the elastic net, regularized optimal transport, etc. We extend the Fast Gradient Method applied to the dual problem so that it becomes primal-dual: it not only solves the dual problem but also constructs a nearly optimal and nearly feasible solution of the primal problem. We also prove a convergence-rate theorem for the proposed algorithm in terms of both the objective function residual and the infeasibility of the linear constraints.
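The abstract's idea can be illustrated with a minimal sketch, not the paper's exact algorithm: for a strongly convex quadratic objective with linear equality constraints, run Nesterov's fast gradient method on the smooth concave dual and recover an approximate primal solution as a weighted average of the primal minimizers computed along the way. The instance, the weights, and names such as `x_of` are illustrative assumptions.

```python
import numpy as np

# Hypothetical small instance: min 0.5*||x||^2 + c^T x  s.t.  A x = b
# (strong convexity constant mu = 1, so the inner minimizer has a closed form).
rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)

def x_of(lam):
    # argmin_x 0.5*||x||^2 + c^T x + lam^T (A x - b)
    return -(c + A.T @ lam)

# Lipschitz constant of the dual gradient: ||A||^2 / mu with mu = 1.
L = np.linalg.norm(A, 2) ** 2

lam = y = np.zeros(m)
x_hat = np.zeros(n)        # weighted average of primal iterates
weight_sum = 0.0
for k in range(2000):
    x_y = x_of(y)                    # primal minimizer at the extrapolated dual point
    grad = A @ x_y - b               # gradient of the (concave) dual function
    lam_next = y + grad / L          # dual ascent step
    y = lam_next + k / (k + 3) * (lam_next - lam)   # Nesterov momentum
    lam = lam_next
    w = k + 1                        # illustrative averaging weights
    x_hat = (weight_sum * x_hat + w * x_y) / (weight_sum + w)
    weight_sum += w

# x_hat is nearly feasible and nearly optimal for the primal problem.
print(np.linalg.norm(A @ x_hat - b))
```

On this toy problem the constraint residual of `x_hat` shrinks with the iteration count, mirroring the primal-dual guarantee the abstract describes.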
Recommendations
- A fast dual proximal gradient algorithm for convex minimization and applications
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- Fast augmented Lagrangian method in the convex regime with convergence guarantees for the iterates
Cited in (22)
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Stochastic saddle-point optimization for the Wasserstein barycenter problem
- Universal intermediate gradient method for convex problems with inexact oracle
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs
- Numerical methods for the resource allocation problem in a computer network
- A fast dual proximal gradient algorithm for convex minimization and applications
- Inexact model: a framework for optimization and variational inequalities
- Near-optimal tensor methods for minimizing the gradient norm of convex functions and accelerated primal–dual tensor methods
- An accelerated directional derivative method for smooth stochastic convex optimization
- Composite optimization for the resource allocation problem
- Fast proximity-gradient algorithms for structured convex optimization problems
- Accelerated dual-averaging primal–dual method for composite convex minimization
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- A dual approach for optimal algorithms in distributed optimization over networks
- On numerical estimates of errors in solving convex optimization problems
- First-order methods for convex optimization
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Alternating minimization methods for strongly convex optimization
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
This page was built for publication: Fast primal-dual gradient method for strongly convex minimization problems with linear constraints