Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850)

From MaRDI portal
Property / cites work: Katyusha: the first direct acceleration of stochastic gradient methods
Property / cites work: Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
Property / cites work: Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver
Property / cites work: Optimal Convergence Rates for Nesterov Acceleration
Property / cites work: Convergence Rates of the Heavy Ball Method for Quasi-strongly Convex Optimization
Property / cites work: Interior Gradient and Proximal Methods for Convex and Conic Optimization
Property / cites work: Q5204822
Property / cites work: A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
Property / cites work: Mirror descent and nonlinear projected subgradient methods for convex optimization
Property / cites work: A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Property / cites work: Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Property / cites work: The exact information-based complexity of smooth convex minimization
Property / cites work: Efficient first-order methods for convex minimization: a constructive approach
Property / cites work: On the oracle complexity of smooth strongly convex minimization
Property / cites work: Performance of first-order methods for smooth convex minimization: a novel approach
Property / cites work: Accelerated gradient methods for nonconvex nonlinear and stochastic programming
Property / cites work: Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
Property / cites work: Accelerated proximal point method for maximally monotone operators
Property / cites work: Optimized first-order methods for smooth convex minimization
Property / cites work: On the convergence analysis of the optimized gradient method
Property / cites work: Adaptive restart of the optimized gradient method for convex optimization
Property / cites work: Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
Property / cites work: Generalizing the Optimized Gradient Method for Smooth Convex Minimization
Property / cites work: Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
Property / cites work: On the convergence rate of the Halpern-iteration
Property / cites work: Relatively Smooth Convex Optimization by First-Order Methods, and Applications
Property / cites work: On optimality of Krylov's information when solving linear operator equations
Property / cites work: Information-based complexity of linear operator equations
Property / cites work: Q3967358
Property / cites work: Introductory lectures on convex optimization. A basic course
Property / cites work: Smooth minimization of non-smooth functions
Property / cites work: Accelerating the cubic regularization of Newton's method on convex problems
Property / cites work: Primal-dual subgradient methods for convex problems
Property / cites work: Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
Property / cites work: Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
Property / cites work: Convex Analysis
Property / cites work: Large-Scale Convex Optimization
Property / cites work: Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
Property / cites work: A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
Property / cites work: An optimal gradient method for smooth strongly convex minimization
Property / cites work: Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
Property / cites work: Smooth strongly convex interpolation and exact worst-case performance of first-order methods
Property / cites work: A variational perspective on accelerated methods in optimization


scientific article; zbMATH DE number 7739286

Language: English
Label: Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Description: scientific article; zbMATH DE number 7739286

    Statements

    Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (English)
    18 September 2023
    convex optimization
    first-order methods
    acceleration

    Identifiers