Optimistic optimisation of composite objective with exponentiated update
From MaRDI portal
Publication:6097136
Abstract: This paper proposes a new family of algorithms for the online optimisation of composite objectives. The algorithms can be interpreted as a combination of the exponentiated gradient and \(p\)-norm algorithms. Combined with algorithmic ideas of adaptivity and optimism, the proposed algorithms achieve a sequence-dependent regret upper bound, matching the best-known bounds for sparse target decision variables. Furthermore, the algorithms have efficient implementations for popular composite objectives and constraints and can be converted to stochastic optimisation algorithms with the optimal accelerated rate for smooth objectives.
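As a rough illustration of the kind of update the abstract describes, the following is a minimal sketch of optimistic follow-the-regularised-leader with an entropic regulariser (i.e. an optimistic exponentiated-gradient update on the probability simplex), using the previous gradient as the optimism hint. It omits the composite-objective and \(p\)-norm machinery of the paper; the function name, step size, and hint choice are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def optimistic_eg(grads, eta=0.1):
    """Optimistic FTRL with entropic regulariser on the probability simplex.

    Plays w_t proportional to exp(-eta * (G_{t-1} + m_t)), where G_{t-1} is
    the running sum of past gradients and the hint m_t is the last observed
    gradient. This is a hedged sketch of the general idea only, not the
    paper's composite/p-norm algorithm.
    """
    d = len(grads[0])
    G = np.zeros(d)       # cumulative gradient sum
    hint = np.zeros(d)    # optimism hint (previous gradient)
    plays = []
    for g in grads:
        logits = -eta * (G + hint)
        logits -= logits.max()           # shift for numerical stability
        w = np.exp(logits)
        w /= w.sum()                     # project back onto the simplex
        plays.append(w)
        G += g
        hint = g                         # next round's hint: current gradient
    return plays

# Toy run with linear losses <g_t, w>: coordinate 2 is consistently best,
# so the iterates concentrate mass on it, mimicking a sparse target.
grads = [np.array([1.0, 1.0, -1.0, 1.0]) for _ in range(50)]
plays = optimistic_eg(grads, eta=0.1)
print(int(np.argmax(plays[-1])))         # mass concentrates on coordinate 2
```

When the gradient sequence is slowly varying, the previous-gradient hint makes each play close to the hindsight-optimal response, which is the mechanism behind the sequence-dependent regret bounds the abstract mentions.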
Recommendations
- A modular analysis of adaptive (non-)convex optimization: optimism, composite objectives, and variational bounds
- A modular analysis of adaptive (non-)convex optimization: optimism, composite objectives, variance reduction, and variational bounds
- Logarithmic Regret Algorithms for Online Convex Optimization
- Logarithmic regret algorithms for online convex optimization
Cites work
- scientific article; zbMATH DE number 823379
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A generalized online mirror descent with applications to classification and regression
- Adaptive subgradient methods for online learning and stochastic optimization
- Convexity and optimization in Banach spaces.
- Exponentiated gradient versus gradient descent for linear predictors
- First-order and stochastic optimization methods for machine learning
- Improved Risk Tail Bounds for On-Line Algorithms
- Introductory lectures on convex optimization. A basic course.
- Linear coupling: an ultimate unification of gradient and mirror descent
- On the Generalization Ability of On-Line Learning Algorithms
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Regularization techniques for learning with matrices
- Scale-free online learning
- Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
- The multiplicative weights update method: a meta-algorithm and applications
- The robustness of the \(p\)-norm algorithms
Cited in (2)