NESUN
From MaRDI portal
Software: 40447
swMATH: 28733
MaRDI QID: Q40447
FDO: Q40447
Author name not available
Cited In (50)
- A universal modification of the linear coupling method
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Adaptivity of stochastic gradient methods for nonconvex optimization
- Inexact model: a framework for optimization and variational inequalities
- Quasi-convex feasibility problems: subgradient methods and convergence rates
- Optimal Affine-Invariant Smooth Minimization Algorithms
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Zeroth-order methods for noisy Hölder-gradient functions
- Universal intermediate gradient method for convex problems with inexact oracle
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
- Stochastic model-based minimization of weakly convex functions
- Efficiency of the accelerated coordinate descent method on structured optimization problems
- Sharpness, restart, and acceleration
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- An accelerated directional derivative method for smooth stochastic convex optimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Universal gradient methods for convex optimization problems
- Smoothness parameter of power of Euclidean norm
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- The method of codifferential descent for convex and global piecewise affine optimization
- Efficiency of minimizing compositions of convex functions and smooth maps
- The approximate duality gap technique: a unified theory of first-order methods
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Implementable tensor methods in unconstrained convex optimization
- Universal method for stochastic composite optimization problems
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- An adaptive proximal method for variational inequalities
- On the adaptivity of stochastic gradient-based optimization
- Accelerated extra-gradient descent: a novel accelerated first-order method
- A dual approach for optimal algorithms in distributed optimization over networks
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- A subgradient method for free material design
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Accelerated schemes for a class of variational inequalities
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Regularized Newton methods for minimizing functions with Hölder continuous Hessians
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Unified acceleration of high-order algorithms under general Hölder continuity
- Fast gradient methods for uniformly convex and weakly smooth problems
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Regularized nonlinear acceleration
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Accelerated first-order methods for hyperbolic programming
This page was built for software: NESUN