Optimized first-order methods for smooth convex minimization
From MaRDI portal
Publication: 312663
DOI: 10.1007/s10107-015-0949-3
zbMath: 1345.90113
arXiv: 1406.5468
OpenAlex: W1547678669
Wikidata: Q41480270 (Scholia: Q41480270)
MaRDI QID: Q312663
Donghwan Kim, Jeffrey A. Fessler
Publication date: 16 September 2016
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1406.5468
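This record does not reproduce the paper's content, but for orientation, the method the title refers to is the optimized gradient method (OGM): a gradient step followed by two momentum terms, with coefficients built from a growing sequence θ. The sketch below is paraphrased from the arXiv version of the paper; the exact coefficient formulas and the toy quadratic should be treated as illustrative assumptions, not a verbatim transcription.

```python
import numpy as np

def ogm(grad, x0, L, n_iters):
    """Minimal OGM sketch: minimize a smooth convex f with L-Lipschitz gradient."""
    x = np.asarray(x0, dtype=float)
    y, theta = x.copy(), 1.0
    for i in range(n_iters):
        y_next = x - grad(x) / L  # plain gradient step
        if i < n_iters - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:  # the final iteration uses a larger factor
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # Nesterov-style momentum (y-difference) plus an extra correction term
        x = (y_next
             + (theta - 1) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return y

# Toy usage: minimize the quadratic f(x) = 0.5 * x^T A x (minimizer at 0)
A = np.diag([1.0, 1.5])
x_final = ogm(lambda x: A @ x, [1.0, 1.0], L=2.0, n_iters=60)
print(np.linalg.norm(x_final))  # small: the iterates approach the minimizer
```

The second momentum term, `theta / theta_next * (y_next - x)`, is what distinguishes OGM from Nesterov's fast gradient method; the paper shows it halves the worst-case cost-function bound.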
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Semidefinite programming (90C22)
- Convex programming (90C25)
- Abstract computational complexity for mathematical programming problems (90C60)
- Quadratic programming (90C20)
- Discrete approximations in optimal control (49M25)
Related Items
- Variational phase recovering without phase unwrapping in phase-shifting interferometry
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Optimized first-order methods for smooth convex minimization
- Optimal complexity and certification of Bregman first-order methods
- A frequency-domain analysis of inexact gradient methods
- An optimal variant of Kelley's cutting-plane method
- Generalizing the Optimized Gradient Method for Smooth Convex Minimization
- Optimal deterministic algorithm generation
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Fast proximal algorithms for nonsmooth convex optimization
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- From the Ravine Method to the Nesterov Method and Vice Versa: A Dynamical System Perspective
- Fast gradient methods for uniformly convex and weakly smooth problems
- An optimal gradient method for smooth strongly convex minimization
- Convergence of iterates for first-order optimization algorithms with inertia and Hessian driven damping
- Convergence rate of inertial proximal algorithms with general extrapolation and proximal coefficients
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Convergence rate of a relaxed inertial proximal algorithm for convex minimization
- Principled analyses and design of first-order methods with inexact proximal operators
- Conic linear optimization for computer-assisted proofs. Abstracts from the workshop held April 10--16, 2022
- From Halpern's fixed-point iterations to Nesterov's accelerated interpretations for root-finding problems
- Another Look at the Fast Iterative Shrinkage/Thresholding Algorithm (FISTA)
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Efficient first-order methods for convex minimization: a constructive approach
- Convergence Rates of Inertial Forward-Backward Algorithms
- Operator Splitting Performance Estimation: Tight Contraction Factors and Optimal Parameter Selection
- Properties of the sign gradient descent algorithms
- Inertial forward-backward algorithms with perturbations: application to Tikhonov regularization
- Effects of dominance on operation policies in a two-stage supply chain in which market demands follow the Bass diffusion model
- Tight Sublinear Convergence Rate of the Proximal Point Algorithm for Maximal Monotone Inclusion Problems
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators
- Accelerated proximal point method for maximally monotone operators
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- The exact information-based complexity of smooth convex minimization
- On the convergence analysis of the optimized gradient method
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
- Bounds for the tracking error of first-order online optimization methods
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Data-Driven Nonsmooth Optimization
- Analysis of optimization algorithms via sum-of-squares
- Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule
- New analysis of linear convergence of gradient-type methods via unifying error bound conditions
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than \(1/k^2\)
- Solving inverse problems using data-driven models
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- Convergence of a relaxed inertial forward-backward algorithm for structured monotone inclusions
- Robust inference for skewed data in health sciences
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- Gradient methods for minimizing composite functions
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Introductory lectures on convex optimization. A basic course.
- Performance of first-order methods for smooth convex minimization: a novel approach
- Adaptive restart for accelerated gradient schemes
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- Graph Implementations for Nonsmooth Convex Programs
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Some methods of speeding up the convergence of iteration methods