PESTO
From MaRDI portal
Software: 32678
swMATH: 20864 · MaRDI QID: Q32678 · FDO: Q32678
Author name not available
Source code repository: https://github.com/AdrienTaylor/Performance-Estimation-Toolbox
Cited In (41)
- On the properties of convex functions over open sets
- Analysis of the gradient method with an Armijo-Wolfe line search on a class of non-smooth convex functions
- Tight sublinear convergence rate of the proximal point algorithm for maximal monotone inclusion problems
- On the oracle complexity of smooth strongly convex minimization
- Data-driven nonsmooth optimization
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- Generalizing the optimized gradient method for smooth convex minimization
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- Title not available
- Exact worst-case performance of first-order methods for composite convex optimization
- Backward-forward-reflected-backward splitting for three operator monotone inclusions
- Bounds for the tracking error of first-order online optimization methods
- Finding the forward-Douglas-Rachford-forward method
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- Optimized first-order methods for smooth convex minimization
- Efficient first-order methods for convex minimization: a constructive approach
- A frequency-domain analysis of inexact gradient methods
- Optimal complexity and certification of Bregman first-order methods
- Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Worst-case convergence analysis of inexact gradient and Newton methods through semidefinite programming performance estimation
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Analysis of optimization algorithms via sum-of-squares
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Title not available
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions
- Optimal step length for the Newton method: case of self-concordant functions
- Accelerated methods for saddle-point problem
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Surrogate-based distributed optimisation for expensive black-box functions
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Solving inverse problems using data-driven models
- Halting time is predictable for large models: a universality property and average-case analysis
- On the convergence rate of the Halpern-iteration
- Finitely determined functions
- Analysis of biased stochastic gradient descent using sequential semidefinite programs