Excessive Gap Technique in Nonsmooth Convex Minimization

DOI: 10.1137/S1052623403422285
zbMath: 1096.90026
OpenAlex: W2000955051
MaRDI QID: Q5317557

Author: Yurii Nesterov

Publication date: 16 September 2005

Published in: SIAM Journal on Optimization

Full work available at URL: https://doi.org/10.1137/s1052623403422285
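
For context, a minimal sketch of the technique named in the title, assuming the standard primal-dual smoothing setup (the notation below is reconstructed, not taken from this record): for a nonsmooth convex primal objective \(f\) and a concave dual \(\phi\) with \(\phi(u) \le f(x)\) for all feasible \(x, u\), one forms smoothed approximations \(f_{\mu_2} \le f\) and \(\phi_{\mu_1} \ge \phi\) by adding prox-terms with parameters \(\mu_1, \mu_2 > 0\). The excessive gap condition links a primal-dual pair \((\bar{x}, \bar{u})\) via
\[
f_{\mu_2}(\bar{x}) \;\le\; \phi_{\mu_1}(\bar{u}).
\]
Because \(f(\bar{x}) \le f_{\mu_2}(\bar{x}) + \mu_2 D_2\) and \(\phi(\bar{u}) \ge \phi_{\mu_1}(\bar{u}) - \mu_1 D_1\), where \(D_1, D_2\) bound the prox-functions on their domains, the condition controls the true duality gap:
\[
0 \;\le\; f(\bar{x}) - \phi(\bar{u}) \;\le\; \mu_1 D_1 + \mu_2 D_2.
\]
Gradient-type updates maintain the condition while driving \(\mu_1, \mu_2 \to 0\) at rate \(O(1/k)\), which yields an \(O(1/k)\) duality gap after \(k\) iterations, i.e., \(O(1/\varepsilon)\) complexity for nonsmooth problems.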



Related Items

Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
New Primal-Dual Algorithms for a Class of Nonsmooth and Nonlinear Convex-Concave Minimax Problems
Accelerated gradient sliding for structured convex optimization
New results on subgradient methods for strongly convex optimization problems with a unified analysis
Gradient methods and conic least-squares problems
Algorithms and software for total variation image reconstruction via first-order methods
Structured Sparsity: Discrete and Convex Approaches
Soft clustering by convex electoral model
Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Dual extrapolation and its applications to solving variational inequalities and related problems
Accelerated Stochastic Algorithms for Convex-Concave Saddle-Point Problems
Smoothing technique and its applications in semidefinite optimization
Approximation accuracy, gradient methods, and error bound for structured convex optimization
Accelerated training of max-margin Markov networks with kernels
First-order methods of smooth convex optimization with inexact oracle
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Accelerated variance-reduced methods for saddle-point problems
Robust Accelerated Primal-Dual Methods for Computing Saddle Points
Conjugate gradient type methods for the nondifferentiable convex minimization
First-order methods for convex optimization
A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
On the computational efficiency of subgradient methods: a case study with Lagrangian bounds
An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
Barrier subgradient method
A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
First-order algorithm with \(\mathcal{O}(\ln(1/\epsilon))\) convergence for \(\epsilon\)-equilibrium in two-person zero-sum games
Robust least square semidefinite programming with applications
An adaptive primal-dual framework for nonsmooth convex minimization
Nesterov's smoothing and excessive gap methods for an optimization problem in VLSI placement
Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
Adaptive smoothing algorithms for nonsmooth composite convex minimization
Regret bounded by gradual variation for online convex optimization
Solving nearly-separable quadratic optimization problems as nonsmooth equations
Suppressing homoclinic chaos for a weak periodically excited non-smooth oscillator
Fast inexact decomposition algorithms for large-scale separable convex optimization
Primal-dual subgradient methods for convex problems
Counterfactual regret minimization for integrated cyber and air defense resource allocation
Faster algorithms for extensive-form game solving via improved smoothing functions
Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
Near-optimal no-regret algorithms for zero-sum games
Rounding of convex sets and efficient gradient methods for linear programming problems
Convex relaxations of penalties for sparse correlated variables with bounded total variation
Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates
Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
On the acceleration of the double smoothing technique for unconstrained convex optimization problems
A family of subgradient-based methods for convex optimization problems in a unifying framework
Solving Large-Scale Optimization Problems with a Convergence Rate Independent of Grid Size
Solving variational inequalities with Stochastic Mirror-Prox algorithm
An Accelerated Linearized Alternating Direction Method of Multipliers
An efficient primal dual prox method for non-smooth optimization
A variable smoothing algorithm for solving convex optimization problems
Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing