Universal gradient methods for convex optimization problems


Publication:494332

DOI: 10.1007/s10107-014-0790-0
zbMath: 1327.90216
OpenAlex: W1985240368
MaRDI QID: Q494332

Yurii Nesterov

Publication date: 31 August 2015

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: http://uclouvain.be/cps/ucl/doc/core/documents/coredp2013_26web.pdf

Related Items (89)

An adaptive analog of Nesterov's method for variational inequalities with a strongly monotone operator
First-order optimization algorithms via inertial systems with Hessian driven damping
OSGA: a fast subgradient algorithm with optimal complexity
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
New results on subgradient methods for strongly convex optimization problems with a unified analysis
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
Zeroth-order methods for noisy Hölder-gradient functions
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Smoothness parameter of power of Euclidean norm
Accelerated schemes for a class of variational inequalities
Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
Fast gradient methods for uniformly convex and weakly smooth problems
An optimal subgradient algorithm with subspace search for costly convex optimization problems
Stopping rules for gradient methods for non-convex problems with additive noise in gradient
Empirical risk minimization: probabilistic complexity and stepsize strategy
Cyclic Coordinate Dual Averaging with Extrapolation
Universal Conditional Gradient Sliding for Convex Optimization
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
Radial duality. II: Applications and algorithms
Optimal Algorithms for Stochastic Complementary Composite Minimization
Multistage transportation model and sufficient conditions for its potentiality
Perturbed Fenchel duality and first-order methods
Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
First-order methods for convex optimization
On the computational efficiency of subgradient methods: a case study with Lagrangian bounds
On optimal universal first-order methods for minimizing heterogeneous sums
On the Adaptivity of Stochastic Gradient-Based Optimization
Unnamed Item
A simple nearly optimal restart scheme for speeding up first-order methods
The impact of noise on evaluation complexity: the deterministic trust-region case
General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
Optimal Affine-Invariant Smooth Minimization Algorithms
Unnamed Item
Accelerated first-order methods for hyperbolic programming
Conditional gradient type methods for composite nonlinear and stochastic optimization
Stochastic Model-Based Minimization of Weakly Convex Functions
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
A universal modification of the linear coupling method
Implementable tensor methods in unconstrained convex optimization
Non-monotone Behavior of the Heavy Ball Method
Inexact reduced gradient methods in nonconvex optimization
On the quality of first-order approximation of functions with Hölder continuous gradient
Universal method for stochastic composite optimization problems
Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
An introduction to continuous optimization for imaging
Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
Complexity bounds for primal-dual methods minimizing the model of objective function
An accelerated directional derivative method for smooth stochastic convex optimization
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
Regularized nonlinear acceleration
The backtrack Hölder gradient method with application to min-max and min-min problems
Gradient descent in the absence of global Lipschitz continuity of the gradients
Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Sharpness, Restart, and Acceleration
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
Complementary composite minimization, small gradients in general norms, and applications
High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
Essentials of numerical nonsmooth optimization
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
High-probability complexity bounds for non-smooth stochastic convex optimization with heavy-tailed noise
Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
A family of subgradient-based methods for convex optimization problems in a unifying framework
Universal method of searching for equilibria and stochastic equilibria in transportation networks
Optimal subgradient methods: computational properties for large-scale linear inverse problems
Generalized uniformly optimal methods for nonlinear programming
A Subgradient Method for Free Material Design
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
Quasi-convex feasibility problems: subgradient methods and convergence rates
Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
Efficiency of minimizing compositions of convex functions and smooth maps
An adaptive proximal method for variational inequalities
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
The method of codifferential descent for convex and global piecewise affine optimization
A dual approach for optimal algorithms in distributed optimization over networks
Essentials of numerical nonsmooth optimization
Inexact model: a framework for optimization and variational inequalities
Universal intermediate gradient method for convex problems with inexact oracle
Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems



This page was built for publication: Universal gradient methods for convex optimization problems