Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
DOI: 10.4230/LIPIcs.ITCS.2017.3 · zbMath: 1402.90209 · arXiv: 1407.1537 · OpenAlex: W2964235750 · MaRDI QID: Q4638051
Authors: Lorenzo Orecchia, Zeyuan Allen-Zhu
Publication date: 3 May 2018
Full work available at URL: https://arxiv.org/abs/1407.1537
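
For context on the method named in the title: the paper couples a plain gradient step, which makes primal progress on the objective, with a mirror-descent step, which controls the accumulated regret, and queries the gradient at a linear combination of the two iterates; with suitable coupling weights this recovers Nesterov-style \(O(1/k^2)\) acceleration for \(L\)-smooth convex objectives. Below is a minimal sketch of that coupling, specialized to the Euclidean prox and to one common step-size schedule consistent with the paper's basic analysis; the names `linear_coupling` and `grad_f` and the toy quadratic are illustrative assumptions, not taken from the source.

```python
import numpy as np

def linear_coupling(grad_f, x0, L, T):
    """Sketch of linear coupling (after Allen-Zhu & Orecchia),
    specialized to the Euclidean prox, for an L-smooth convex f.

    grad_f: callable returning the gradient of f
    x0:     starting point (numpy array)
    L:      smoothness (Lipschitz constant of the gradient)
    T:      number of iterations
    """
    y = x0.copy()  # gradient-descent sequence (primal progress)
    z = x0.copy()  # mirror-descent sequence (regret control)
    for k in range(T):
        alpha = (k + 2) / (2.0 * L)    # growing mirror step size (assumed schedule)
        tau = 1.0 / (alpha * L)        # coupling weight, here 2/(k+2)
        x = tau * z + (1.0 - tau) * y  # linear coupling of the two iterates
        g = grad_f(x)
        y = x - g / L                  # gradient step from the coupled point
        z = z - alpha * g              # Euclidean mirror step
    return y

# Illustrative usage on a toy least-squares objective f(x) = 0.5 * ||A x - b||^2.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 10))
    b = rng.standard_normal(20)
    L = np.linalg.norm(A, 2) ** 2  # spectral norm squared = smoothness constant
    x_hat = linear_coupling(lambda x: A.T @ (A @ x - b), np.zeros(10), L, 200)
```

With the Euclidean prox the mirror step collapses to a plain step on \(z\); in a general non-Euclidean setup that line would instead be the Bregman-projection update of mirror descent.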
Related Items (34)
- The Walrasian equilibrium and centralized distributed optimization in terms of modern convex optimization methods on the example of resource allocation problem
- Efficient numerical methods to solve sparse linear equations with application to PageRank
- Optimized first-order methods for smooth convex minimization
- MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Accelerated Methods for NonConvex Optimization
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- Nearly linear-time packing and covering LP solvers: achieving width-independence and \(O(1/\varepsilon)\)-convergence
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Social welfare and profit maximization from revealed preferences
- Optimistic optimisation of composite objective with exponentiated update
- Direct nonlinear acceleration
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- The optimal dynamic regret for smoothed online convex optimization with squared \(l_2\) norm switching costs
- An Optimal First Order Method Based on Optimal Quadratic Averaging
- Optimal Affine-Invariant Smooth Minimization Algorithms
- Convergence Rates of Proximal Gradient Methods via the Convex Conjugate
- Convergence of first-order methods via the convex conjugate
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
- A universal modification of the linear coupling method
- Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
- A version of the mirror descent method to solve variational inequalities
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
- Bounds for the tracking error of first-order online optimization methods
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Accelerated directional search with non-Euclidean prox-structure
- Fastest rates for stochastic mirror descent methods
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity
- Generalized Momentum-Based Methods: A Hamiltonian Perspective
- Accelerating variance-reduced stochastic gradient methods
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- An optimal method for stochastic composite optimization
- Universal gradient methods for convex optimization problems
- Accelerating the cubic regularization of Newton's method on convex problems
- A primal-dual perspective of online learning algorithms
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- Lectures on Modern Convex Optimization
- Nearly-Linear Time Positive LP Solver with Faster Convergence Rate
- Online Learning and Online Convex Optimization
- Accelerated, Parallel, and Proximal Coordinate Descent
- Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver
- Fast Approximation Algorithms for Fractional Packing and Covering Problems
- Logarithmic Regret Algorithms for Online Convex Optimization
- Using Optimization to Break the Epsilon Barrier: A Faster and Simpler Width-Independent Algorithm for Solving Positive Linear Programs in Parallel
- An Almost-Linear-Time Algorithm for Approximate Max Flow in Undirected Graphs, and its Multicommodity Generalizations
- Optimal Distributed Online Prediction using Mini-Batches
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Approximating the exponential, the Lanczos method and an Õ(m)-time spectral algorithm for balanced separator
- A new approach to computing maximum flows using electrical flows
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization