Pages that link to "Item:Q2248759"
From MaRDI portal
The following pages link to Performance of first-order methods for smooth convex minimization: a novel approach (Q2248759):
Displaying 50 items.
- Optimized first-order methods for smooth convex minimization (Q312663)
- An optimal variant of Kelley's cutting-plane method (Q344947)
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods (Q507324)
- The exact information-based complexity of smooth convex minimization (Q511109)
- On the convergence analysis of the optimized gradient method (Q511969)
- Accelerated proximal algorithms with a correction term for monotone inclusions (Q832632)
- iPiasco: inertial proximal algorithm for strongly convex optimization (Q890114)
- Optimal deterministic algorithm generation (Q1668803)
- Adaptive restart of the optimized gradient method for convex optimization (Q1670019)
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization (Q1670100)
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions (Q1679617)
- Rate of convergence of inertial gradient dynamics with time-dependent viscous damping coefficient (Q1711864)
- Cyclic schemes for PDE-based image analysis (Q1991489)
- Analysis of biased stochastic gradient descent using sequential semidefinite programs (Q2020610)
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions (Q2026726)
- Bounds for the tracking error of first-order online optimization methods (Q2032000)
- A stochastic subspace approach to gradient-free optimization in high dimensions (Q2044475)
- Analysis of optimization algorithms via sum-of-squares (Q2046552)
- Optimal complexity and certification of Bregman first-order methods (Q2149545)
- A frequency-domain analysis of inexact gradient methods (Q2149575)
- Synthesis of accelerated gradient algorithms for optimization and saddle point problems using Lyapunov functions and LMIs (Q2154842)
- Backward-forward-reflected-backward splitting for three operator monotone inclusions (Q2185395)
- Efficient first-order methods for convex minimization: a constructive approach (Q2205976)
- Accelerated methods for saddle-point problems (Q2214606)
- Accelerated proximal point method for maximally monotone operators (Q2235140)
- Regularized nonlinear acceleration (Q2288185)
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions (Q2297652)
- Finding the forward-Douglas-Rachford-forward method (Q2302831)
- Fast proximal algorithms for nonsmooth convex optimization (Q2661564)
- Accelerated additive Schwarz methods for convex optimization with adaptive restart (Q2666023)
- Fast gradient methods for uniformly convex and weakly smooth problems (Q2673504)
- The exact worst-case convergence rate of the gradient method with fixed step lengths for \(L\)-smooth functions (Q2673524)
- Inertial Proximal Alternating Linearized Minimization (iPALM) for Nonconvex and Nonsmooth Problems (Q3179622)
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems (Q3300773)
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints (Q3465237)
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization (Q4571883)
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA) (Q4603039)
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems (Q4687235)
- (Q4995610)
- (Q4999082)
- Robust and structure exploiting optimisation algorithms: an integral quadratic constraint approach (Q5012635)
- An Optimal High-Order Tensor Method for Convex Optimization (Q5026443)
- Gradient descent technology for sparse vector learning in ontology algorithms (Q5069813)
- Fast convergence of generalized forward-backward algorithms for structured monotone inclusions (Q5091986)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649)
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3 (Q5107904)
- Convergence rate of a relaxed inertial proximal algorithm for convex minimization (Q5110325)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation (Q5116548)
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection (Q5123997)
- Data-Driven Nonsmooth Optimization (Q5210515)