Complexity bounds for primal-dual methods minimizing the model of objective function
From MaRDI portal
Publication: Q1785201
DOI: 10.1007/s10107-017-1188-6
zbMATH Open: 1397.90351
OpenAlex: W2140180727
MaRDI QID: Q1785201
FDO: Q1785201
Authors: Yurii Nesterov
Publication date: 28 September 2018
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-017-1188-6
Recommendations
- New analysis and results for the Frank-Wolfe method
- Primal-dual subgradient methods for convex problems
- Gradient methods for minimizing composite functions
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- Restarting Frank-Wolfe: faster rates under Hölderian error bounds
Keywords: convex optimization; trust-region method; conditional gradient method; complexity bounds; linear optimization oracle
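For orientation, the keywords above refer to the conditional gradient (Frank-Wolfe) method, which works with a linear optimization oracle over the feasible set rather than a projection. The following is a minimal illustrative sketch, not code from the publication: it minimizes a quadratic over the probability simplex, whose linear minimization oracle simply returns the vertex with the smallest gradient coordinate. The problem data and step-size rule are standard textbook choices, not taken from this paper.

```python
import numpy as np

def frank_wolfe(A, b, n_iters=200):
    """Conditional gradient method for f(x) = 0.5*||Ax - b||^2 over the simplex."""
    m, n = A.shape
    x = np.ones(n) / n                   # start at the simplex barycenter
    for k in range(n_iters):
        grad = A.T @ (A @ x - b)         # gradient of the quadratic objective
        s = np.zeros(n)
        s[np.argmin(grad)] = 1.0         # LMO over the simplex: a single vertex
        gamma = 2.0 / (k + 2.0)          # classical O(1/k) open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = A @ np.array([0.2, 0.5, 0.3])        # a minimizer lies inside the simplex
x = frank_wolfe(A, b)
print(np.round(x, 3), 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Each iteration needs only one call to the linear minimization oracle, which is the operation whose cost the complexity bounds in this line of work are stated in terms of.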
Cites Work
- Title not available
- Introductory lectures on convex optimization. A basic course.
- Gradient methods for minimizing composite functions
- Primal-dual subgradient methods for convex problems
- Trust Region Methods
- Solving variational inequalities with monotone operators on domains given by linear minimization oracles
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Universal gradient methods for convex optimization problems
- New analysis and results for the Frank-Wolfe method
- Cubic regularization of Newton method and its global performance
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- Quasi-monotone subgradient methods for nonsmooth convex minimization
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules
- A regularization of the Frank-Wolfe method and unification of certain nonlinear programming methods
- Generalized conditional gradient for sparse estimation
Cited In (30)
- Technical note -- Dynamic data-driven estimation of nonparametric choice models
- Adaptive conditional gradient method
- Inexact model: a framework for optimization and variational inequalities
- First-order methods for convex optimization
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- Dual methods for finding equilibriums in mixed models of flow distribution in large transportation networks
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- The approximate duality gap technique: a unified theory of first-order methods
- Inexact proximal stochastic second-order methods for nonconvex composite optimization
- Short paper -- A note on the Frank-Wolfe algorithm for a class of nonconvex and nonsmooth optimization problems
- High-order optimization methods for fully composite problems
- Universal Conditional Gradient Sliding for Convex Optimization
- A generalized Frank-Wolfe method with “dual averaging” for strongly convex composite optimization
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- PCA Sparsified
- Affine Invariant Convergence Rates of the Conditional Gradient Method
- Nonsmooth projection-free optimization with functional constraints
- Gradient methods with memory
- Efficient numerical methods to solve sparse linear equations with application to PageRank
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Duality gap estimates for a class of greedy optimization algorithms in Banach spaces
- Affine-invariant contracting-point methods for convex optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Duality gap estimates for weak Chebyshev greedy algorithms in Banach spaces
- Perturbed Fenchel duality and first-order methods
- Unified acceleration of high-order algorithms under general Hölder continuity
- Exact gradient methods with memory
- A unified analysis of stochastic gradient‐free Frank–Wolfe methods