Pages that link to "Item:Q507324"
From MaRDI portal
The following pages link to Smooth strongly convex interpolation and exact worst-case performance of first-order methods (Q507324):
Displaying 48 items.
- Optimized first-order methods for smooth convex minimization (Q312663)
- On the convergence analysis of the optimized gradient method (Q511969)
- On the convergence rate of the Halpern-iteration (Q828660)
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization (Q1670100)
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions (Q1679617)
- Analysis of biased stochastic gradient descent using sequential semidefinite programs (Q2020610)
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (Q2026726)
- Bounds for the tracking error of first-order online optimization methods (Q2032000)
- Analysis of optimization algorithms via sum-of-squares (Q2046552)
- On the oracle complexity of smooth strongly convex minimization (Q2052164)
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis (Q2133415)
- Optimal complexity and certification of Bregman first-order methods (Q2149545)
- Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry (Q2149561)
- A frequency-domain analysis of inexact gradient methods (Q2149575)
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs (Q2154842)
- A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives (Q2165600)
- Backward-forward-reflected-backward splitting for three operator monotone inclusions (Q2185395)
- Efficient first-order methods for convex minimization: a constructive approach (Q2205976)
- Accelerated methods for saddle-point problem (Q2214606)
- Accelerated proximal point method for maximally monotone operators (Q2235140)
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions (Q2297652)
- Finding the forward-Douglas-Rachford-forward method (Q2302831)
- Finitely determined functions (Q2656197)
- Surrogate-based distributed optimisation for expensive black-box functions (Q2663899)
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions (Q2673524)
- Halting time is predictable for large models: a universality property and average-case analysis (Q2697399)
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems (Q3300773)
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization (Q4571883)
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA) (Q4603039)
- A generic online acceleration scheme for optimization algorithms via relaxation and inertia (Q4622890)
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems (Q4687235)
- (Q4999039)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation (Q5116548)
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection (Q5123997)
- Data-Driven Nonsmooth Optimization (Q5210515)
- Solving inverse problems using data-driven models (Q5230520)
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization (Q5275297)
- On the Properties of Convex Functions over Open Sets (Q5856375)
- An optimal gradient method for smooth strongly convex minimization (Q6038652)
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850)
- Conditions for linear convergence of the gradient method for non-convex optimization (Q6097482)
- A Systematic Approach to Lyapunov Analyses of Continuous-Time Models in Convex Optimization (Q6116244)
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods (Q6120850)
- An elementary approach to tight worst case complexity analysis of gradient based methods (Q6165581)
- Principled analyses and design of first-order methods with inexact proximal operators (Q6165584)
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022 (Q6170529)
- Structured \((\min ,+)\)-convolution and its applications for the shortest/closest vector and nonlinear knapsack problems (Q6181364)