The following pages link to NESUN (Q40447):
Displaying 50 items.
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517)
- New results on subgradient methods for strongly convex optimization problems with a unified analysis (Q316174)
- Universal gradient methods for convex optimization problems (Q494332)
- Accelerated schemes for a class of variational inequalities (Q1680963)
- Optimal subgradient algorithms for large-scale convex optimization in simple domains (Q1689457)
- Accelerated first-order methods for hyperbolic programming (Q1717219)
- Universal method for stochastic composite optimization problems (Q1746349)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\) (Q1752352)
- On the quality of first-order approximation of functions with Hölder continuous gradient (Q1985266)
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381)
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (Q2031939)
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (Q2042418)
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (Q2044481)
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization (Q2046565)
- Quasi-convex feasibility problems: subgradient methods and convergence rates (Q2076909)
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems (Q2117629)
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle (Q2159456)
- Zeroth-order methods for noisy Hölder-gradient functions (Q2162695)
- Smoothness parameter of power of Euclidean norm (Q2178876)
- Implementable tensor methods in unconstrained convex optimization (Q2227532)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192)
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum (Q2287166)
- Regularized nonlinear acceleration (Q2288185)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Universal method of searching for equilibria and stochastic equilibria in transportation networks (Q2314190)
- Optimal subgradient methods: computational properties for large-scale linear inverse problems (Q2315075)
- Efficiency of minimizing compositions of convex functions and smooth maps (Q2330660)
- An adaptive proximal method for variational inequalities (Q2332639)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906)
- Empirical risk minimization: probabilistic complexity and stepsize strategy (Q2419551)
- Fast gradient methods for uniformly convex and weakly smooth problems (Q2673504)
- A Subgradient Method for Free Material Design (Q2832891)
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems (Q2957979)
- Stochastic Model-Based Minimization of Weakly Convex Functions (Q4620418)
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy (Q4629334)
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods (Q4629338)
- A universal modification of the linear coupling method (Q4631767)
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization (Q4971021)
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method (Q4993286)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Q5076671)
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle (Q5085262)
- On the Adaptivity of Stochastic Gradient-Based Optimization (Q5114394)
- Sharpness, Restart, and Acceleration (Q5210521)
- Optimal Affine-Invariant Smooth Minimization Algorithms (Q5376450)
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians (Q5737717)
- The method of codifferential descent for convex and global piecewise affine optimization (Q5859002)
- A dual approach for optimal algorithms in distributed optimization over networks (Q5859014)
- Inexact model: a framework for optimization and variational inequalities (Q5865338)
- Universal intermediate gradient method for convex problems with inexact oracle (Q5865342)