The following pages link to NESUN (Q40447):
Displaying 34 items.
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)
- New results on subgradient methods for strongly convex optimization problems with a unified analysis (Q316174)
- Universal gradient methods for convex optimization problems (Q494332)
- Accelerated schemes for a class of variational inequalities (Q1680963)
- Optimal subgradient algorithms for large-scale convex optimization in simple domains (Q1689457)
- Accelerated first-order methods for hyperbolic programming (Q1717219)
- Universal method for stochastic composite optimization problems (Q1746349)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\) (Q1752352)
- On the quality of first-order approximation of functions with Hölder continuous gradient (Q1985266)
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381)
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (Q2031939)
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (Q2042418)
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (Q2044481)
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization (Q2046565)
- Quasi-convex feasibility problems: subgradient methods and convergence rates (Q2076909)
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems (Q2117629)
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle (Q2159456)
- Zeroth-order methods for noisy Hölder-gradient functions (Q2162695)
- Smoothness parameter of power of Euclidean norm (Q2178876)
- Implementable tensor methods in unconstrained convex optimization (Q2227532)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192)
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum (Q2287166)
- Regularized nonlinear acceleration (Q2288185)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Universal method of searching for equilibria and stochastic equilibria in transportation networks (Q2314190)
- Optimal subgradient methods: computational properties for large-scale linear inverse problems (Q2315075)
- Efficiency of minimizing compositions of convex functions and smooth maps (Q2330660)
- An adaptive proximal method for variational inequalities (Q2332639)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906)
- Empirical risk minimization: probabilistic complexity and stepsize strategy (Q2419551)
- Fast gradient methods for uniformly convex and weakly smooth problems (Q2673504)
- A Subgradient Method for Free Material Design (Q2832891)
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems (Q2957979)
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method (Q4993286)