NESUN
From MaRDI portal
Software:40447
No author found.
Related Items (50)
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
New results on subgradient methods for strongly convex optimization problems with a unified analysis
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
Zeroth-order methods for noisy Hölder-gradient functions
Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Smoothness parameter of power of Euclidean norm
Accelerated schemes for a class of variational inequalities
Fast gradient methods for uniformly convex and weakly smooth problems
An optimal subgradient algorithm with subspace search for costly convex optimization problems
Empirical risk minimization: probabilistic complexity and stepsize strategy
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
On the Adaptivity of Stochastic Gradient-Based Optimization
Optimal Affine-Invariant Smooth Minimization Algorithms
Accelerated first-order methods for hyperbolic programming
Stochastic Model-Based Minimization of Weakly Convex Functions
Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
A universal modification of the linear coupling method
Implementable tensor methods in unconstrained convex optimization
Universal gradient methods for convex optimization problems
On the quality of first-order approximation of functions with Hölder continuous gradient
Universal method for stochastic composite optimization problems
Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
An accelerated directional derivative method for smooth stochastic convex optimization
Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
Regularized nonlinear acceleration
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Sharpness, Restart, and Acceleration
Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Universal method of searching for equilibria and stochastic equilibria in transportation networks
Optimal subgradient methods: computational properties for large-scale linear inverse problems
A Subgradient Method for Free Material Design
Quasi-convex feasibility problems: subgradient methods and convergence rates
Efficiency of minimizing compositions of convex functions and smooth maps
An adaptive proximal method for variational inequalities
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
The method of codifferential descent for convex and global piecewise affine optimization
A dual approach for optimal algorithms in distributed optimization over networks
Inexact model: a framework for optimization and variational inequalities
Universal intermediate gradient method for convex problems with inexact oracle
Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
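The related items above largely build on Nesterov's universal gradient methods for functions with Hölder continuous gradients, the family of algorithms NESUN implements. As a rough illustration only (not NESUN's actual code), the following sketch shows the core idea: a gradient method with backtracking on the smoothness estimate L, which needs no a priori Lipschitz constant and adapts to the unknown Hölder exponent through an inexact descent test with tolerance ε/2. All names here are hypothetical.

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps=1e-6, L0=1.0, max_iter=500):
    """Sketch of a universal (parameter-free) gradient method.

    Backtracks on the local smoothness estimate L until the
    eps-relaxed quadratic upper-bound test holds, then lets L
    shrink again so the step size can adapt from iteration to
    iteration. Illustrative only; NESUN's implementation differs.
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(max_iter):
        g = grad(x)
        while True:
            y = x - g / L          # candidate gradient step with stepsize 1/L
            d = y - x
            # inexact sufficient-decrease test with slack eps/2
            if f(y) <= f(x) + g @ d + 0.5 * L * (d @ d) + eps / 2:
                break
            L *= 2.0               # step too long: double L and retry
        x, L = y, L / 2.0          # accept step; allow L to decrease next time
        if np.linalg.norm(grad(x)) <= eps:
            break
    return x
```

For example, minimizing the smooth quadratic `f(x) = 0.5 * x @ x` from `x0 = [3.0, -4.0]` drives the iterate to the origin without ever being told the Lipschitz constant of the gradient.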
This page was built for software: NESUN