Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent

DOI: 10.4230/LIPIcs.ITCS.2017.3
zbMath: 1402.90209
arXiv: 1407.1537
OpenAlex: W2964235750
MaRDI QID: Q4638051

Lorenzo Orecchia, Zeyuan Allen-Zhu

Publication date: 3 May 2018

Full work available at URL: https://arxiv.org/abs/1407.1537

Related Items (35)

The Walrasian equilibrium and centralized distributed optimization in terms of modern convex optimization methods on the example of resource allocation problem
Efficient numerical methods to solve sparse linear equations with application to PageRank
Optimized first-order methods for smooth convex minimization
MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
Accelerated Methods for NonConvex Optimization
Primal–dual accelerated gradient methods with small-dimensional relaxation oracle
Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence
Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
Social welfare and profit maximization from revealed preferences
Optimistic optimisation of composite objective with exponentiated update
Direct nonlinear acceleration
No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
Stochastic incremental mirror descent algorithms with Nesterov smoothing
The optimal dynamic regret for smoothed online convex optimization with squared \(l_2\) norm switching costs
An Optimal First Order Method Based on Optimal Quadratic Averaging
Optimal Affine-Invariant Smooth Minimization Algorithms
Convergence Rates of Proximal Gradient Methods via the Convex Conjugate
Convergence of first-order methods via the convex conjugate
The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods
A universal modification of the linear coupling method
Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
A version of the mirror descent method to solve variational inequalities
Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis
Unnamed Item
Bounds for the tracking error of first-order online optimization methods
Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
Accelerated directional search with non-Euclidean prox-structure
Fastest rates for stochastic mirror descent methods
Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
Unified Acceleration of High-Order Algorithms under General Hölder Continuity
Generalized Momentum-Based Methods: A Hamiltonian Perspective
Accelerating variance-reduced stochastic gradient methods

Cites Work

This page was built for publication: Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent