An optimal first order method based on optimal quadratic averaging
DOI: 10.1137/16M1072528 · zbMATH Open: 1382.65169 · arXiv: 1604.06543 · OpenAlex: W2964093568 · MaRDI QID: Q4603040
Authors: Maryam Fazel, Scott Roy, Dmitriy Drusvyatskiy
Publication date: 14 February 2018
Published in: SIAM Journal on Optimization
Abstract: In a recent paper, Bubeck, Lee, and Singh introduced a new first order method for minimizing smooth strongly convex functions. Their geometric descent algorithm, largely inspired by the ellipsoid method, enjoys the optimal linear rate of convergence. We show that the same iterate sequence is generated by a scheme that in each iteration computes an optimal average of quadratic lower-models of the function. Indeed, the minimum of the averaged quadratic approaches the true minimum at an optimal rate. This intuitive viewpoint reveals clear connections to the original fast-gradient methods and cutting plane ideas, and leads to limited-memory extensions with improved performance.
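The averaging step described in the abstract can be sketched concretely. Each gradient evaluation of a \(\mu\)-strongly convex \(f\) at a point \(x\) yields a quadratic lower model with minimum value \(f(x) - \|\nabla f(x)\|^2/(2\mu)\) attained at \(x - \nabla f(x)/\mu\), and two such models are combined with the convex weight that maximizes the minimum of the average. The following is a minimal illustrative sketch, not the paper's algorithm: the query points here are plain gradient steps rather than the line search used in the paper, and all names are hypothetical.

```python
import numpy as np

def quad_avg_sketch(f, grad, x0, mu, L, iters=60):
    """Illustrative sketch of optimal quadratic averaging.

    Maintains one running lower model Q(x) = v + (mu/2)||x - c||^2
    and optimally averages it with the model induced by each new
    gradient evaluation; v is a nondecreasing lower bound on min f.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    # gradient at x induces the lower model with center c = x - g/mu
    # and minimum value v = f(x) - ||g||^2 / (2*mu)
    c = x - g / mu
    v = f(x) - g.dot(g) / (2 * mu)
    for _ in range(iters):
        x = x - g / L  # plain gradient step as next query point (a simplification)
        g = grad(x)
        c_new = x - g / mu
        v_new = f(x) - g.dot(g) / (2 * mu)
        d2 = np.sum((c - c_new) ** 2)
        if d2 == 0.0:
            lam = 1.0 if v_new >= v else 0.0
        else:
            # maximize lam*v_new + (1-lam)*v + (mu/2)*lam*(1-lam)*d2 over [0, 1]
            lam = min(1.0, max(0.0, 0.5 + (v_new - v) / (mu * d2)))
        v = lam * v_new + (1 - lam) * v + 0.5 * mu * lam * (1 - lam) * d2
        c = lam * c_new + (1 - lam) * c
    return x, c, v
```

The averaged minimum value uses the identity that a convex combination of two quadratics with identical curvature \(\mu\) is again such a quadratic, with center the averaged centers and minimum value lifted by \(\tfrac{\mu}{2}\lambda(1-\lambda)\|c_A - c_B\|^2\).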
Full work available at URL: https://arxiv.org/abs/1604.06543
Recommendations
- Optimized first-order methods for smooth convex minimization
- Optimizing first-order methods for smooth convex minimization with Q-linear convergence of the gradient
- An optimal gradient method for smooth strongly convex minimization
- An acceleration procedure for optimal first-order methods
- On the convergence analysis of the optimized gradient method
Keywords: convergence; strong convexity; first-order method; ellipsoid method; convex quadratic; accelerated gradient method; geometric descent algorithm
Cites Work
- Introductory lectures on convex optimization. A basic course.
- Analysis and design of optimization algorithms via integral quadratic constraints
- Linear coupling: an ultimate unification of gradient and mirror descent
- The Cutting-Plane Method for Solving Convex Programs
- On the convexity of a class of quadratic mappings and its application to the problem of finding the smallest ball enclosing a given intersection of balls
- Fast convex optimization via inertial dynamics with Hessian driven damping
Cited In (15)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Efficient first-order methods for convex minimization: a constructive approach
- The condition number of a function relative to a set
- The approximate duality gap technique: a unified theory of first-order methods
- An acceleration procedure for optimal first-order methods
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Convergence rates of proximal gradient methods via the convex conjugate
- Gradient methods with memory
- Understanding the acceleration phenomenon via high-resolution differential equations
- Generalized momentum-based methods: a Hamiltonian perspective
- Perturbed Fenchel duality and first-order methods
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
- The common-directions method for regularized empirical risk minimization
- Exact gradient methods with memory
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization