Pages that link to "Item:Q4638051"
The following pages link to Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent (Q4638051):
Displayed 35 items.
- Proximal Methods for Sparse Optimal Scoring and Discriminant Analysis (Q97534) (← links)
- Optimized first-order methods for smooth convex minimization (Q312663) (← links)
- A version of the mirror descent method to solve variational inequalities (Q681901) (← links)
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints (Q1683173) (← links)
- Convergence of first-order methods via the convex conjugate (Q1728354) (← links)
- Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs (Q1983570) (← links)
- Bounds for the tracking error of first-order online optimization methods (Q2032000) (← links)
- Fastest rates for stochastic mirror descent methods (Q2044496) (← links)
- Accelerating variance-reduced stochastic gradient methods (Q2118092) (← links)
- Social welfare and profit maximization from revealed preferences (Q2190403) (← links)
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator (Q2289040) (← links)
- Accelerated directional search with non-Euclidean prox-structure (Q2290400) (← links)
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems (Q2313241) (← links)
- Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence (Q2414908) (← links)
- MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization (Q3179624) (← links)
- (Q4558559) (← links)
- Accelerated Methods for NonConvex Optimization (Q4571877) (← links)
- An Optimal First Order Method Based on Optimal Quadratic Averaging (Q4603040) (← links)
- Convergence Rates of Proximal Gradient Methods via the Convex Conjugate (Q4620416) (← links)
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods (Q4629338) (← links)
- A universal modification of the linear coupling method (Q4631767) (← links)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214) (← links)
- The Walrasian equilibrium and centralized distributed optimization in terms of modern convex optimization methods on the example of resource allocation problem (Q5043038) (← links)
- Efficient numerical methods to solve sparse linear equations with application to PageRank (Q5043846) (← links)
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization (Q5081777) (← links)
- Primal–dual accelerated gradient methods with small-dimensional relaxation oracle (Q5085262) (← links)
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization (Q5093649) (← links)
- Optimal Affine-Invariant Smooth Minimization Algorithms (Q5376450) (← links)
- Generalized Momentum-Based Methods: A Hamiltonian Perspective (Q5857293) (← links)
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods (Q6073850) (← links)
- Optimistic optimisation of composite objective with exponentiated update (Q6097136) (← links)
- Direct nonlinear acceleration (Q6114957) (← links)
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization (Q6126650) (← links)
- Stochastic incremental mirror descent algorithms with Nesterov smoothing (Q6145577) (← links)
- The optimal dynamic regret for smoothed online convex optimization with squared \(l_2\) norm switching costs (Q6157294) (← links)