Conditional gradient algorithms for norm-regularized smooth convex optimization
From MaRDI portal
Abstract: Motivated by some applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone \(K\), a norm \(\|\cdot\|\) and a smooth convex function \(f\), we want either 1) to minimize the norm over the intersection of the cone and a level set of \(f\), or 2) to minimize over the cone the sum of \(f\) and a multiple of the norm. We focus on the case where (a) the dimension of the problem is too large to allow for interior point algorithms, and (b) \(\|\cdot\|\) is "too complicated" to allow for the computationally cheap Bregman projections required in first-order proximal gradient algorithms. On the other hand, we assume that it is relatively easy to minimize linear forms over the intersection of \(K\) and the unit \(\|\cdot\|\)-ball. Motivating examples are given by the nuclear norm with \(K\) being the entire space of matrices, or the positive semidefinite cone in the space of symmetric matrices, and the Total Variation norm on the space of 2D images. We discuss versions of the Conditional Gradient algorithm capable of handling our problems of interest, provide the related theoretical efficiency estimates, and outline some applications.
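The setting described above can be illustrated with a minimal sketch (not the paper's exact algorithm): a basic conditional gradient (Frank-Wolfe) iteration with the standard open-loop step size, applied to a toy least-squares objective over the unit \(\ell_1\)-ball, a simple case where the linear minimization oracle is cheap. The instance data `A`, `b` are made up for illustration.

```python
import numpy as np

def lmo_l1(grad):
    """Linear minimization oracle for the unit l1-ball:
    argmin_{||s||_1 <= 1} <grad, s> is a signed coordinate vector."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -np.sign(grad[i])
    return s

def frank_wolfe(grad_f, x0, n_iters=200):
    """Conditional gradient with the open-loop step gamma_t = 2/(t+2)."""
    x = x0.copy()
    for t in range(n_iters):
        s = lmo_l1(grad_f(x))          # call the LMO at the current gradient
        gamma = 2.0 / (t + 2)          # standard open-loop step-size rule
        x = (1 - gamma) * x + gamma * s  # convex combination stays in the ball
    return x

# Toy instance: f(x) = 0.5 * ||Ax - b||^2 minimized over the unit l1-ball
# (A and b are hypothetical illustration data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)

x = frank_wolfe(grad_f, np.zeros(10))
```

Each iteration needs only one gradient evaluation and one LMO call, with no projection onto the feasible set, which is the computational regime the abstract emphasizes.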
Recommendations
- Improved complexities of conditional gradient-type methods with applications to robust matrix recovery problems
- On lower complexity bounds for large-scale smooth convex optimization
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- Simplified versions of the conditional gradient method
- On proximal gradient method for the convex problems regularized with the group reproducing kernel norm
Cites work
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 3597791
- scientific article; zbMATH DE number 3345848
- A Singular Value Thresholding Algorithm for Matrix Completion
- Accuracy certificates for computational problems with convex structure
- An extension of the Frank and Wolfe method of feasible directions
- An optimal method for stochastic composite optimization
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Conditional gradient algorithms with open loop step size rules
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Exact matrix completion via convex optimization
- Fixed point and Bregman iterative methods for matrix rank minimization
- Gradient methods for minimizing composite functions
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Learning Theory
- Linearized augmented Lagrangian and alternating direction methods for nuclear norm minimization
- Modelling of landslides with the material-point method
- New variants of bundle methods
- Nonlinear total variation based noise removal algorithms
- On first-order algorithms for \(\ell_{1}\)/nuclear norm minimization
- Optimization with sparsity-inducing penalties
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Randomized first order algorithms with applications to \(\ell _{1}\)-minimization
- Restricted simplicial decomposition for convex constrained problems
- Restricted simplicial decomposition: Computation and extensions
- Sparse Approximate Solutions to Semidefinite Programs
Cited in (56 documents)
- A unified analysis of stochastic gradient‐free Frank–Wolfe methods
- Secant-inexact projection algorithms for solving a new class of constrained mixed generalized equations problems
- No-regret dynamics in the Fenchel game: a unified framework for algorithmic convex optimization
- Solving variational inequalities with monotone operators on domains given by linear minimization oracles
- Inexact model: a framework for optimization and variational inequalities
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Generalized conditional gradient for sparse estimation
- Improved complexities of conditional gradient-type methods with applications to robust matrix recovery problems
- A level-set method for convex optimization with a feasible solution path
- Analysis of the Frank-Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier
- The cyclic block conditional gradient method for convex optimization problems
- Quadratic decomposable submodular function minimization: theory and practice
- The sliding Frank-Wolfe algorithm and its application to super-resolution microscopy
- A distributed Frank-Wolfe framework for learning low-rank matrices with the trace norm
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Low complexity regularization of linear inverse problems
- Decomposition techniques for bilinear saddle point problems and variational inequalities with affine monotone operators
- Projection-free accelerated method for convex optimization
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization
- A new low-cost feasible projection algorithm for pseudomonotone variational inequalities
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Conditional gradient method for multiobjective optimization
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Level-set methods for convex optimization
- Frank-Wolfe Methods with an Unbounded Feasible Region and Applications to Structured Learning
- Scalable robust matrix recovery: Frank-Wolfe meets proximal methods
- On lower complexity bounds for large-scale smooth convex optimization
- Alternating conditional gradient method for convex feasibility problems
- Universal Conditional Gradient Sliding for Convex Optimization
- Generalized conditional gradient with augmented Lagrangian for composite minimization
- Frank-Wolfe-type methods for a class of nonconvex inequality-constrained problems
- New analysis and results for the Frank-Wolfe method
- Affine Invariant Convergence Rates of the Conditional Gradient Method
- Complementary composite minimization, small gradients in general norms, and applications
- Screening for a reweighted penalized conditional gradient method
- Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
- Gradient iteration with \(\ell _{p}\)-norm constraints
- Conditional Gradient Methods for Convex Optimization with General Affine and Nonlinear Constraints
- Cut pursuit: fast algorithms to learn piecewise constant functions on general weighted graphs
- Efficient numerical methods to solve sparse linear equations with application to PageRank
- An extended Frank-Wolfe method with "in-face" directions, and its application to low-rank matrix completion
- The alternating descent conditional gradient method for sparse inverse problems
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Noisy Euclidean Distance Realization: Robust Facial Reduction and the Pareto Frontier
- On the Frank-Wolfe algorithm for non-compact constrained optimization problems
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Conditional gradient sliding for convex optimization
- On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
- Conditional gradient method for double-convex fractional programming matrix problems
- scientific article; zbMATH DE number 4057300
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- scientific article; zbMATH DE number 7064051
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- The Frank-Wolfe algorithm: a short introduction
- Conditional gradient type methods for composite nonlinear and stochastic optimization
This page was built for publication: Conditional gradient algorithms for norm-regularized smooth convex optimization
MaRDI item Q494314