Universal gradient methods for convex optimization problems
From MaRDI portal
Publication: 494332
DOI: 10.1007/s10107-014-0790-0
zbMath: 1327.90216
OpenAlex: W1985240368
MaRDI QID: Q494332
Publication date: 31 August 2015
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: http://uclouvain.be/cps/ucl/doc/core/documents/coredp2013_26web.pdf
Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Minimax problems in mathematical programming (90C47)
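The method named in the title, Nesterov's universal primal gradient method, picks its step size by backtracking on an estimated constant L, using the target accuracy ε as slack in the acceptance test, so the Hölder smoothness parameters of the objective never need to be known. A minimal sketch (the function name, the gradient-norm stopping test, and the quadratic usage example below are illustrative assumptions, not taken from this page):

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=1000):
    """Sketch of a universal primal gradient method.

    Backtracks on the smoothness estimate L until the quadratic upper
    model holds with slack eps/2; no Holder parameters are required.
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(max_iter):
        g = grad(x)
        while True:
            x_new = x - g / L  # gradient step with the current estimate
            d = x_new - x
            # accept the step if the (L, eps/2)-upper model bounds f at x_new
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0           # otherwise double L and retry
        L /= 2.0               # allow L to shrink again on the next iteration
        x = x_new
        if np.linalg.norm(g) <= eps:  # simplified stopping test (assumption)
            break
    return x
```

For example, on the smooth quadratic f(x) = ½‖x‖², the backtracking immediately accepts L = 1 and the iterate jumps to the minimizer at the origin.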
Related Items (89)
An adaptive analog of Nesterov's method for variational inequalities with a strongly monotone operator ⋮
First-order optimization algorithms via inertial systems with Hessian driven damping ⋮
OSGA: a fast subgradient algorithm with optimal complexity ⋮
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients ⋮
New results on subgradient methods for strongly convex optimization problems with a unified analysis ⋮
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle ⋮
Zeroth-order methods for noisy Hölder-gradient functions ⋮
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization ⋮
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method ⋮
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle ⋮
Smoothness parameter of power of Euclidean norm ⋮
Accelerated schemes for a class of variational inequalities ⋮
Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints ⋮
Fast gradient methods for uniformly convex and weakly smooth problems ⋮
An optimal subgradient algorithm with subspace search for costly convex optimization problems ⋮
Stopping rules for gradient methods for non-convex problems with additive noise in gradient ⋮
Empirical risk minimization: probabilistic complexity and stepsize strategy ⋮
Cyclic Coordinate Dual Averaging with Extrapolation ⋮
Universal Conditional Gradient Sliding for Convex Optimization ⋮
Optimal subgradient algorithms for large-scale convex optimization in simple domains ⋮
Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter ⋮
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems ⋮
Radial duality. II: Applications and algorithms ⋮
Optimal Algorithms for Stochastic Complementary Composite Minimization ⋮
Multistage transportation model and sufficient conditions for its potentiality ⋮
Perturbed Fenchel duality and first-order methods ⋮
Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness ⋮
First-order methods for convex optimization ⋮
On the computational efficiency of subgradient methods: a case study with Lagrangian bounds ⋮
On optimal universal first-order methods for minimizing heterogeneous sums ⋮
On the Adaptivity of Stochastic Gradient-Based Optimization ⋮
Unnamed Item ⋮
A simple nearly optimal restart scheme for speeding up first-order methods ⋮
The impact of noise on evaluation complexity: the deterministic trust-region case ⋮
General Hölder smooth convergence rates follow from specialized rates assuming growth bounds ⋮
Optimal Affine-Invariant Smooth Minimization Algorithms ⋮
Unnamed Item ⋮
Accelerated first-order methods for hyperbolic programming ⋮
Conditional gradient type methods for composite nonlinear and stochastic optimization ⋮
Stochastic Model-Based Minimization of Weakly Convex Functions ⋮
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy ⋮
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods ⋮
A universal modification of the linear coupling method ⋮
Implementable tensor methods in unconstrained convex optimization ⋮
Non-monotone Behavior of the Heavy Ball Method ⋮
Inexact reduced gradient methods in nonconvex optimization ⋮
On the quality of first-order approximation of functions with Hölder continuous gradient ⋮
Universal method for stochastic composite optimization problems ⋮
Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\) ⋮
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians ⋮
An introduction to continuous optimization for imaging ⋮
Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point ⋮
Complexity bounds for primal-dual methods minimizing the model of objective function ⋮
An accelerated directional derivative method for smooth stochastic convex optimization ⋮
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach ⋮
On the properties of the method of minimization for convex functions with relaxation on the distance to extremum ⋮
Regularized nonlinear acceleration ⋮
The backtrack Hölder gradient method with application to min-max and min-min problems ⋮
Gradient descent in the absence of global Lipschitz continuity of the gradients ⋮
Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems ⋮
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems ⋮
Sharpness, Restart, and Acceleration ⋮
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization ⋮
Complementary composite minimization, small gradients in general norms, and applications ⋮
High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods ⋮
Essentials of numerical nonsmooth optimization ⋮
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization ⋮
High-probability complexity bounds for non-smooth stochastic convex optimization with heavy-tailed noise ⋮
Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints ⋮
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization ⋮
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity ⋮
Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems ⋮
A family of subgradient-based methods for convex optimization problems in a unifying framework ⋮
Universal method of searching for equilibria and stochastic equilibria in transportation networks ⋮
Optimal subgradient methods: computational properties for large-scale linear inverse problems ⋮
Generalized uniformly optimal methods for nonlinear programming ⋮
A Subgradient Method for Free Material Design ⋮
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent ⋮
Quasi-convex feasibility problems: subgradient methods and convergence rates ⋮
Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\) ⋮
Efficiency of minimizing compositions of convex functions and smooth maps ⋮
An adaptive proximal method for variational inequalities ⋮
Unified Acceleration of High-Order Algorithms under General Hölder Continuity ⋮
The method of codifferential descent for convex and global piecewise affine optimization ⋮
A dual approach for optimal algorithms in distributed optimization over networks ⋮
Essentials of numerical nonsmooth optimization ⋮
Inexact model: a framework for optimization and variational inequalities ⋮
Universal intermediate gradient method for convex problems with inexact oracle ⋮
Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
Uses Software
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Introductory lectures on convex optimization. A basic course.
- New variants of bundle methods
- Optimal methods of smooth convex minimization
- Vector approximation problems in geometric vector optimization