PESTO
From MaRDI portal
Software:32678
No author found.
Source code repository: https://github.com/AdrienTaylor/Performance-Estimation-Toolbox
Related Items (41)
On the convergence rate of the Halpern-iteration
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
Optimized first-order methods for smooth convex minimization
Optimal complexity and certification of Bregman first-order methods
Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry
A frequency-domain analysis of inexact gradient methods
On the Properties of Convex Functions over Open Sets
Finitely determined functions
Generalizing the Optimized Gradient Method for Smooth Convex Minimization
Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
Surrogate-based distributed optimisation for expensive black-box functions
On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
Backward-forward-reflected-backward splitting for three operator monotone inclusions
Unnamed Item
Unnamed Item
Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Efficient first-order methods for convex minimization: a constructive approach
Halting time is predictable for large models: a universality property and average-case analysis
Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
Accelerated methods for saddle-point problem
A generic online acceleration scheme for optimization algorithms via relaxation and inertia
Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
Smooth strongly convex interpolation and exact worst-case performance of first-order methods
The exact information-based complexity of smooth convex minimization
On the convergence analysis of the optimized gradient method
Analysis of biased stochastic gradient descent using sequential semidefinite programs
Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
Bounds for the tracking error of first-order online optimization methods
Data-Driven Nonsmooth Optimization
Analysis of the gradient method with an Armijo–Wolfe line search on a class of non-smooth convex functions
Analysis of optimization algorithms via sum-of-squares
New analysis of linear convergence of gradient-type methods via unifying error bound conditions
On the oracle complexity of smooth strongly convex minimization
Finding the forward-Douglas-Rachford-forward method
Solving inverse problems using data-driven models
Optimal step length for the Newton method: case of self-concordant functions
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
This page was built for software: PESTO