NESUN
Software: 40447
swMATH: 28733
MaRDI QID: Q40447
FDO: Q40447
Author name not available
Cited In (50)
- Sharpness, Restart, and Acceleration
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- A universal modification of the linear coupling method
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Inexact model: a framework for optimization and variational inequalities
- Quasi-convex feasibility problems: subgradient methods and convergence rates
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization
- Optimal Affine-Invariant Smooth Minimization Algorithms
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Zeroth-order methods for noisy Hölder-gradient functions
- Universal intermediate gradient method for convex problems with inexact oracle
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- On the Adaptivity of Stochastic Gradient-Based Optimization
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems
- An accelerated directional derivative method for smooth stochastic convex optimization
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Universal gradient methods for convex optimization problems
- Smoothness parameter of power of Euclidean norm
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- The method of codifferential descent for convex and global piecewise affine optimization
- Efficiency of minimizing compositions of convex functions and smooth maps
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Implementable tensor methods in unconstrained convex optimization
- Universal method for stochastic composite optimization problems
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- An adaptive proximal method for variational inequalities
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
- A dual approach for optimal algorithms in distributed optimization over networks
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- A subgradient method for free material design
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
- Accelerated schemes for a class of variational inequalities
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Stochastic Model-Based Minimization of Weakly Convex Functions
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Fast gradient methods for uniformly convex and weakly smooth problems
- On the quality of first-order approximation of functions with Hölder continuous gradient
- Regularized nonlinear acceleration
- Empirical risk minimization: probabilistic complexity and stepsize strategy
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- Accelerated first-order methods for hyperbolic programming
This page was built for software: NESUN