Universal gradient methods for convex optimization problems
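For context, the publication indexed here concerns gradient methods that adapt automatically to the (unknown) Hölder smoothness of the objective. A minimal sketch of the core idea, a primal gradient step with backtracking on the local smoothness estimate `M` and an accuracy slack `eps/2`, is given below; the function names and the stopping rule are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, M0=1.0, max_iter=1000):
    """Sketch of a universal (primal) gradient step with backtracking.

    At each iteration the smallest M = 2^i * M_k is found such that the
    inexact quadratic upper bound holds with slack eps/2, so no Hölder
    exponent or smoothness constant needs to be known in advance.
    """
    x, M = np.asarray(x0, dtype=float), M0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:   # simple illustrative stopping rule
            break
        M = max(M / 2.0, 1e-12)        # allow the estimate to decrease
        while True:
            x_new = x - g / M          # gradient step with step size 1/M
            d = x_new - x
            # inexact descent condition with tolerance eps/2
            if f(x_new) <= f(x) + g @ d + 0.5 * M * (d @ d) + eps / 2:
                break
            M *= 2.0                   # backtrack: double the estimate
        x = x_new
    return x

# toy usage: minimize f(x) = ||x||^2 / 2, whose minimizer is the origin
x_star = universal_gradient_method(lambda x: 0.5 * (x @ x), lambda x: x,
                                   x0=[3.0, -4.0], eps=1e-6)
```

The backtracking loop is what makes the method "universal": the same code handles smooth, weakly smooth, and nonsmooth convex objectives, paying only through the accepted values of `M`.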
From MaRDI portal
Cites work
- scientific article; zbMATH DE number 3790208 (no title available)
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- First-order methods of smooth convex optimization with inexact oracle
- Gradient methods for minimizing composite functions
- Introductory lectures on convex optimization. A basic course.
- New variants of bundle methods
- Optimal methods of smooth convex minimization
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- Vector approximation problems in geometric vector optimization
Cited in (98)
- Efficiency of minimizing compositions of convex functions and smooth maps
- Perturbed Fenchel duality and first-order methods
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- The approximate duality gap technique: a unified theory of first-order methods
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Unified acceleration of high-order algorithms under general Hölder continuity
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
- First-order optimization algorithms via inertial systems with Hessian driven damping
- Fast gradient methods for uniformly convex and weakly smooth problems
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Regularized nonlinear acceleration
- Universal intermediate gradient method for convex problems with inexact oracle
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- A simple nearly optimal restart scheme for speeding up first-order methods
- On the adaptivity of stochastic gradient-based optimization
- Adaptivity of stochastic gradient methods for nonconvex optimization
- Inexact model: a framework for optimization and variational inequalities
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- An accelerated directional derivative method for smooth stochastic convex optimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Complexity bounds for primal-dual methods minimizing the model of objective function
- A universal modification of the linear coupling method
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- Accelerated meta-algorithm for convex optimization problems
- An introduction to continuous optimization for imaging
- Recent advances in structural optimization
- Dynamic smoothness parameter for fast gradient methods
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- Sharpness, restart, and acceleration
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- Accelerated schemes for a class of variational inequalities
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Accelerated first-order methods for hyperbolic programming
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Stochastic model-based minimization of weakly convex functions
- Essentials of numerical nonsmooth optimization
- Linear coupling: an ultimate unification of gradient and mirror descent
- A dual approach for optimal algorithms in distributed optimization over networks
- OSGA: a fast subgradient algorithm with optimal complexity
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- Smoothness parameter of power of Euclidean norm
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- An adaptive proximal method for variational inequalities
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
- First-order methods for convex optimization
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Universal method for stochastic composite optimization problems
- Implementable tensor methods in unconstrained convex optimization
- Generalized uniformly optimal methods for nonlinear programming
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- An optimal gradient method for smooth strongly convex minimization
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- Efficient first-order methods for convex minimization: a constructive approach
- Quasi-convex feasibility problems: subgradient methods and convergence rates
- On the computational efficiency of subgradient methods: a case study with Lagrangian bounds
- Uniform rank gradient, cost, and local-global convergence
- Zeroth-order methods for noisy Hölder-gradient functions
- Multistage transportation model and sufficient conditions for its potentiality
- scientific article; zbMATH DE number 7726319 (no title available)
- Optimal Affine-Invariant Smooth Minimization Algorithms
- High-probability complexity bounds for non-smooth stochastic convex optimization with heavy-tailed noise
- scientific article; zbMATH DE number 4041186 (no title available)
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
- Complexity-optimal and parameter-free first-order methods for finding stationary points of composite optimization problems
- An adaptive analog of Nesterov's method for variational inequalities with a strongly monotone operator
- Gradient descent in the absence of global Lipschitz continuity of the gradients
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- A universal accelerated primal-dual method for convex optimization problems
- Cyclic Coordinate Dual Averaging with Extrapolation
- scientific article; zbMATH DE number 7656028 (no title available)
- Non-monotone Behavior of the Heavy Ball Method
- Accelerated extra-gradient descent: a novel accelerated first-order method
- Gradient methods with memory
- The impact of noise on evaluation complexity: the deterministic trust-region case
- Optimal Algorithms for Stochastic Complementary Composite Minimization
- Stopping rules for gradient methods for non-convex problems with additive noise in gradient
- The method of codifferential descent for convex and global piecewise affine optimization
- Radial duality. II: Applications and algorithms
- Inexact reduced gradient methods in nonconvex optimization
- The backtrack Hölder gradient method with application to min-max and min-min problems
- Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
- Essentials of numerical nonsmooth optimization
- Complementary composite minimization, small gradients in general norms, and applications
- High-order methods beyond the classical complexity bounds: inexact high-order proximal-point methods
- Universal nonmonotone line search method for nonconvex multiobjective optimization problems with convex constraints
- On optimal universal first-order methods for minimizing heterogeneous sums
- A subgradient method for free material design
- Universal Conditional Gradient Sliding for Convex Optimization