Primal-dual first-order methods with O(1/ε) iteration-complexity for cone programming
DOI: 10.1007/s10107-008-0261-6 · zbMATH Open: 1208.90113 · OpenAlex: W2042860556 · MaRDI QID: Q623454
Authors: Guanghui Lan, Zhaosong Lu, Renato D. C. Monteiro
Publication date: 14 February 2011
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-008-0261-6
Recommendations
- Primal-dual first-order methods for a class of cone programming
- On iteration complexity of a first-order primal-dual method for nonlinear convex cone programming
- Iteration complexity analysis of dual first-order methods for conic convex programming
- Iteration-complexity of first-order penalty methods for convex programming
- First- and second-order methods for semidefinite programming
Mathematics Subject Classification
- Numerical mathematical programming methods (65K05)
- Numerical optimization and variational techniques (65K10)
- Convex programming (90C25)
- Linear programming (90C05)
- Semidefinite programming (90C22)
Cites Work
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Solving semidefinite-quadratic-linear programs using SDPT3
- Smooth minimization of non-smooth functions
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Smooth Optimization with Approximate Gradient
- Local minima and convergence in low-rank semidefinite programming
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Smoothing technique and its applications in semidefinite optimization
- Large-scale semidefinite programming via a saddle point mirror-prox algorithm
Cited In (65)
- An inertial projection and contraction algorithm for pseudomonotone variational inequalities without Lipschitz continuity
- Two classes of spectral three-term derivative-free method for solving nonlinear equations with application
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
- Policy Mirror Descent for Regularized Reinforcement Learning: A Generalized Framework with Linear Convergence
- Accelerated first-order methods for a class of semidefinite programs
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- Environmental game modeling with uncertainties
- An \(O(\sqrt {n} L)\) iteration bound primal-dual cone affine scaling algorithm for linear programming
- Algorithms for stochastic optimization with function or expectation constraints
- A preconditioning technique for first-order primal-dual splitting method in convex optimization
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- An extrapolated iteratively reweighted \(\ell_1\) method with complexity analysis
- On inexact solution of auxiliary problems in tensor methods for convex optimization
- Super-resolution of positive sources: the discrete setup
- An adaptive accelerated first-order method for convex optimization
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
- A level-set method for convex optimization with a feasible solution path
- Adaptive restart for accelerated gradient schemes
- An alternating direction method for finding Dantzig selectors
- Fastest rates for stochastic mirror descent methods
- An accelerated linearized alternating direction method of multipliers
- Matrix-free convex optimization modeling
- Dynamic stochastic approximation for multi-stage stochastic optimization
- An optimal randomized incremental gradient method
- The supporting halfspace-quadratic programming strategy for the dual of the best approximation problem
- An exact penalty method for nonconvex problems covering, in particular, nonlinear programming, semidefinite programming, and second-order cone programming
- OSGA: a fast subgradient algorithm with optimal complexity
- A primal-dual homotopy algorithm for \(\ell _{1}\)-minimization with \(\ell _{\infty }\)-constraints
- Polyhedral approximations in \(p\)-order cone programming
- Iteration-complexity of first-order penalty methods for convex programming
- Iteration complexity analysis of dual first-order methods for conic convex programming
- Iteratively reweighted \(\ell _1\) algorithms with extrapolation
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Random gradient extrapolation for distributed and stochastic optimization
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- An average curvature accelerated composite gradient method for nonconvex smooth composite optimization problems
- Proportional-integral projected gradient method for conic optimization
- A FISTA-type accelerated gradient algorithm for solving smooth nonconvex composite optimization problems
- Accelerating block-decomposition first-order methods for solving composite saddle-point and two-player Nash equilibrium problems
- On iteration complexity of a first-order primal-dual method for nonlinear convex cone programming
- A new convergence analysis and perturbation resilience of some accelerated proximal forward-backward algorithms with errors
- An efficient primal dual prox method for non-smooth optimization
- A dual approach for optimal algorithms in distributed optimization over networks
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- Implementation of an optimal first-order method for strongly convex total variation regularization
- Primal-dual first-order methods for a class of cone programming
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Conic optimization via operator splitting and homogeneous self-dual embedding
- On the convergence properties of non-Euclidean extragradient methods for variational inequalities with generalized monotone operators
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- Bregman proximal point algorithm revisited: a new inexact version and its inertial variant
- Smoothing proximal gradient method for general structured sparse regression
- A smoothing stochastic gradient method for composite optimization
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- A very simple SQCQP method for a class of smooth convex constrained minimization problems with nice convergence results
- Performance of first-order methods for smooth convex minimization: a novel approach
- A multi-step doubly stabilized bundle method for nonsmooth convex optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Templates for convex cone problems with applications to sparse signal recovery
- A characterization theorem and an algorithm for a convex hull problem
- An extended sequential quadratically constrained quadratic programming algorithm for nonlinear, semidefinite, and second-order cone programming
- Exact gradient methods with memory
- Accelerated gradient sliding for structured convex optimization
- Accelerated first-order methods for hyperbolic programming