Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute gradient" for structured convex optimization
DOI: 10.1007/s10107-020-01480-7 · zbMATH Open: 1465.90063 · arXiv: 1807.07680 · OpenAlex: W3009614481 · MaRDI QID: Q2020608
Authors: Haihao Lu, Robert M. Freund
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1807.07680
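For orientation, the classic Frank-Wolfe (conditional gradient) step that this paper generalizes replaces projection with a linear minimization oracle over the feasible set. Below is a minimal, hypothetical sketch of that baseline method on the probability simplex; it illustrates the deterministic step only, not the paper's stochastic "substitute gradient" scheme, and all names in it are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_iters=2000):
    """Sketch of classic Frank-Wolfe for a smooth convex f over the
    probability simplex (an assumed baseline, not the paper's algorithm).

    grad: callable returning the gradient of f at x.
    On the simplex, the linear minimization oracle (LMO) is simply the
    vertex e_i with the smallest gradient coordinate.
    """
    x = x0.copy()
    for t in range(num_iters):
        g = grad(x)
        i = int(np.argmin(g))        # LMO: best simplex vertex for <g, s>
        s = np.zeros_like(x)
        s[i] = 1.0
        gamma = 2.0 / (t + 2.0)      # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

if __name__ == "__main__":
    # f(x) = 0.5 * ||x - b||^2 with b in the simplex, so the minimizer is b.
    b = np.array([0.2, 0.5, 0.3])
    x = frank_wolfe_simplex(lambda x: x - b, np.array([1.0, 0.0, 0.0]))
    print(x)
```

Because every iterate is a convex combination of simplex vertices, the method is projection-free, which is the structural property the paper's stochastic variant preserves while replacing the exact gradient with a cheap stochastic "substitute gradient".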
Recommendations
- Frank-Wolfe style algorithms for large scale optimization
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- New analysis and results for the Frank-Wolfe method
- Frank–Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
MSC classification: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Analysis of algorithms and problem complexity (68Q25)
Cites Work
- Title not available
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- Exact matrix completion via convex optimization
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- CUR matrix decompositions for improved data analysis
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Title not available
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Subgradient methods for huge-scale optimization problems
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Title not available
- Stochastic dual coordinate ascent methods for regularized loss minimization
- Title not available
- Duality between subgradient and conditional gradient methods
- Stochastic primal-dual coordinate method for regularized empirical risk minimization
- On the complexity analysis of randomized block-coordinate descent methods
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- New analysis and results for the Frank-Wolfe method
- Regularization techniques for learning with matrices
- Forward-backward splitting with Bregman distances
- Online submodular minimization
- Minimizing finite sums with the stochastic average gradient
- Conditional gradient sliding for convex optimization
- Relatively smooth convex optimization by first-order methods, and applications
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- Stochastic conditional gradient methods: from convex minimization to submodular maximization
- An accelerated randomized proximal coordinate gradient method and its application to regularized empirical risk minimization
- Generalized conditional gradient for sparse estimation
- Katyusha: the first direct acceleration of stochastic gradient methods
- An extended Frank-Wolfe method with "in-face" directions, and its application to low-rank matrix completion
Cited In (11)
- Inexact and stochastic generalized conditional gradient with augmented Lagrangian and proximal step
- Using Taylor-approximated gradients to improve the Frank-Wolfe method for empirical risk minimization
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- Frank-Wolfe style algorithms for large scale optimization
- Frank–Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- A generalized Frank-Wolfe method with "dual averaging" for strongly convex composite optimization
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Sequential quadratic optimization for nonlinear equality constrained stochastic optimization
- FrankWolfe.jl: A High-Performance and Flexible Toolbox for Frank–Wolfe Algorithms and Conditional Gradients
- A sequential quadratic programming method with high-probability complexity bounds for nonlinear equality-constrained stochastic optimization
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization