Excessive Gap Technique in Nonsmooth Convex Minimization

From MaRDI portal
Revision as of 22:35, 8 February 2024 by Import240129110113 (created automatically from import240129110113)

Publication: 5317557

DOI: 10.1137/S1052623403422285
zbMath: 1096.90026
OpenAlex: W2000955051
MaRDI QID: Q5317557

Author: Yurii Nesterov

Publication date: 16 September 2005

Published in: SIAM Journal on Optimization

Full work available at URL: https://doi.org/10.1137/s1052623403422285




Related Items (59)

* Reducing the Complexity of Two Classes of Optimization Problems by Inexact Accelerated Proximal Gradient Method
* New Primal-Dual Algorithms for a Class of Nonsmooth and Nonlinear Convex-Concave Minimax Problems
* Accelerated gradient sliding for structured convex optimization
* New results on subgradient methods for strongly convex optimization problems with a unified analysis
* Gradient methods and conic least-squares problems
* Algorithms and software for total variation image reconstruction via first-order methods
* Structured Sparsity: Discrete and Convex Approaches
* Soft clustering by convex electoral model
* Combining Lagrangian decomposition and excessive gap smoothing technique for solving large-scale separable convex optimization problems
* Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
* Dual extrapolation and its applications to solving variational inequalities and related problems
* Accelerated Stochastic Algorithms for Convex-Concave Saddle-Point Problems
* Smoothing technique and its applications in semidefinite optimization
* Approximation accuracy, gradient methods, and error bound for structured convex optimization
* Accelerated training of max-margin Markov networks with kernels
* First-order methods of smooth convex optimization with inexact oracle
* Optimal subgradient algorithms for large-scale convex optimization in simple domains
* Accelerated variance-reduced methods for saddle-point problems
* Robust Accelerated Primal-Dual Methods for Computing Saddle Points
* Conjugate gradient type methods for the nondifferentiable convex minimization
* First-order methods for convex optimization
* A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
* On the computational efficiency of subgradient methods: a case study with Lagrangian bounds
* An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
* Barrier subgradient method
* A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
* First-order algorithm with \({\mathcal{O}(\ln(1/\epsilon))}\) convergence for \({\epsilon}\)-equilibrium in two-person zero-sum games
* Unnamed Item
* Robust least square semidefinite programming with applications
* An adaptive primal-dual framework for nonsmooth convex minimization
* Nesterov's smoothing and excessive gap methods for an optimization problem in VLSI placement
* Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
* Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
* Adaptive smoothing algorithms for nonsmooth composite convex minimization
* Regret bounded by gradual variation for online convex optimization
* Solving nearly-separable quadratic optimization problems as nonsmooth equations
* Suppressing homoclinic chaos for a weak periodically excited non-smooth oscillator
* Fast inexact decomposition algorithms for large-scale separable convex optimization
* Primal-dual subgradient methods for convex problems
* Counterfactual regret minimization for integrated cyber and air defense resource allocation
* Faster algorithms for extensive-form game solving via improved smoothing functions
* Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
* Near-optimal no-regret algorithms for zero-sum games
* Rounding of convex sets and efficient gradient methods for linear programming problems
* Unnamed Item
* Convex relaxations of penalties for sparse correlated variables with bounded total variation
* Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates
* Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
* Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
* On the acceleration of the double smoothing technique for unconstrained convex optimization problems
* A family of subgradient-based methods for convex optimization problems in a unifying framework
* Solving Large-Scale Optimization Problems with a Convergence Rate Independent of Grid Size
* Solving variational inequalities with Stochastic Mirror-Prox algorithm
* An Accelerated Linearized Alternating Direction Method of Multipliers
* An efficient primal dual prox method for non-smooth optimization
* A variable smoothing algorithm for solving convex optimization problems
* Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
* Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
* Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing







This page was built for publication: Excessive Gap Technique in Nonsmooth Convex Minimization