Pages that link to "Item:Q1785201"
From MaRDI portal
The following pages link to Complexity bounds for primal-dual methods minimizing the model of objective function (Q1785201):
Displaying 30 items.
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints (Q1683173)
- Dual methods for finding equilibriums in mixed models of flow distribution in large transportation networks (Q1757624)
- Duality gap estimates for a class of greedy optimization algorithms in Banach spaces (Q2117632)
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point (Q2278192)
- Adaptive conditional gradient method (Q2278897)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123)
- Duality gap estimates for weak Chebyshev greedy algorithms in Banach spaces (Q2337146)
- Affine-invariant contracting-point methods for convex optimization (Q2687041)
- Generalized self-concordant analysis of Frank-Wolfe algorithms (Q2687046)
- Perturbed Fenchel duality and first-order methods (Q2687051)
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods (Q4629338)
- Generalized Conditional Gradient with Augmented Lagrangian for Composite Minimization (Q4971021)
- Unified Acceleration of High-Order Algorithms under General Hölder Continuity (Q5003214)
- Technical Note—Dynamic Data-Driven Estimation of Nonparametric Choice Models (Q5031619)
- Efficient numerical methods to solve sparse linear equations with application to PageRank (Q5043846)
- Gradient methods with memory (Q5043847)
- Exact gradient methods with memory (Q5058416)
- Inexact proximal stochastic second-order methods for nonconvex composite optimization (Q5135256)
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming (Q5219732)
- Inexact model: a framework for optimization and variational inequalities (Q5865338)
- High-Order Optimization Methods for Fully Composite Problems (Q5869820)
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier (Q6038640)
- Universal Conditional Gradient Sliding for Convex Optimization (Q6071883)
- Affine Invariant Convergence Rates of the Conditional Gradient Method (Q6076864)
- A unified analysis of stochastic gradient‐free Frank–Wolfe methods (Q6092499)
- Short paper -- A note on the Frank-Wolfe algorithm for a class of nonconvex and nonsmooth optimization problems (Q6114895)
- A generalized Frank-Wolfe method with ``dual averaging'' for strongly convex composite optimization (Q6164957)
- First-order methods for convex optimization (Q6169988)
- PCA Sparsified (Q6176425)
- Nonsmooth projection-free optimization with functional constraints (Q6642795)