Fast gradient methods for uniformly convex and weakly smooth problems
Abstract: This paper considers the acceleration of gradient methods for convex optimization problems with weak levels of convexity and smoothness. Starting from the universal fast gradient method, which was designed as an optimal method for weakly smooth problems whose gradients are Hölder continuous, the momentum term is modified so that the method also accommodates uniformly convex and weakly smooth problems. Unlike existing works, the fast gradient methods proposed in this paper do not rely on a restarting technique; instead, they use momentum terms designed to reflect both the uniform convexity and the weak smoothness of the target energy function. Theoretical and numerical results supporting the superiority of the proposed methods are presented.
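As a point of reference for the abstract, the following is a minimal sketch of an accelerated gradient method with a backtracking estimate of the smoothness constant, in the spirit of the universal fast gradient method the paper builds on. The momentum rule shown is the standard Nesterov update, not the paper's uniform-convexity-adapted momentum, and the test energy (a \(p\)-norm power, a common weakly smooth and uniformly convex model problem) is an illustrative assumption.

```python
# Minimal sketch: accelerated gradient method with backtracking, in the
# spirit of the universal fast gradient method for weakly smooth
# (Hölder-gradient) objectives. The momentum below is the standard
# Nesterov update, NOT the paper's uniform-convexity-aware momentum;
# the test problem is an illustrative assumption.
import numpy as np

def fast_gradient(f, grad, x0, n_iters=200, L0=1.0, eps=1e-8):
    """Accelerated gradient descent with an adaptive (backtracked)
    smoothness estimate, so no Hölder exponent is needed a priori."""
    x = y = np.asarray(x0, dtype=float)
    L, t = L0, 1.0
    for _ in range(n_iters):
        g = grad(y)
        # Backtracking: grow L until the quadratic upper model holds at y.
        while True:
            x_new = y - g / L
            if f(x_new) <= f(y) - np.dot(g, g) / (2 * L) + eps:
                break
            L *= 2.0
        # Standard Nesterov momentum (placeholder for the paper's
        # uniform-convexity-adapted momentum design).
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L /= 2.0  # let the smoothness estimate shrink again

    return x

# Illustrative uniformly convex, weakly smooth energy:
# f(x) = ||x||_p^p / p with p = 4 (uniform convexity of degree 4).
p = 4
f = lambda x: np.sum(np.abs(x) ** p) / p
grad = lambda x: np.sign(x) * np.abs(x) ** (p - 1)
x_star = fast_gradient(f, grad, x0=np.ones(5))
print(f(x_star))  # should be near 0
```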
Recommendations
- Universal gradient methods for convex optimization problems
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- An optimal gradient method for smooth strongly convex minimization
- Gradient methods for minimizing composite functions
Cites work
- Scientific article, zbMATH DE number 3850830 (no title available)
- 64.4 Some logarithm inequalities
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Multigrid Algorithm for the \(p\)-Laplacian
- A block coordinate descent method for regularized multiconvex optimization with applications to nonnegative tensor factorization and completion
- A simple nearly optimal restart scheme for speeding up first-order methods
- Adaptive restart for accelerated gradient schemes
- Adaptive restart of accelerated gradient methods under local quadratic growth condition
- Additive Schwarz methods for convex optimization as gradient methods
- An introduction to continuous optimization for imaging
- Backtracking strategies for accelerated descent methods with smooth composite objectives
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Convex analysis and monotone operator theory in Hilbert spaces
- First-order methods of smooth convex optimization with inexact oracle
- Generalizing the optimized gradient method for smooth convex minimization
- Gradient methods for minimizing composite functions
- Lectures on convex optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Optimal methods of smooth convex minimization
- Optimized first-order methods for smooth convex minimization
- Performance of first-order methods for smooth convex minimization: a novel approach
- Preconditioned descent algorithms for \(p\)-Laplacian
- Preconditioned steepest descent methods for some nonlinear elliptic equations involving \(p\)-Laplacian terms
- Pseudo-linear convergence of an additive Schwarz method for dual total variation minimization
- Sharpness, restart, and acceleration
- Smooth minimization of non-smooth functions
- The steepest descent algorithm without line search for \(p\)-Laplacian
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Uniformly convex and uniformly smooth convex functions
- Universal gradient methods for convex optimization problems
Cited in (15)
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- A universal modification of the linear coupling method
- Dynamic smoothness parameter for fast gradient methods
- An Optimized Dynamic Mode Decomposition Model Robust to Multiplicative Noise
- Scientific article, zbMATH DE number 4041186 (no title available)
- Accelerated methods for weakly-quasi-convex optimization problems
- Fast proximity-gradient algorithms for structured convex optimization problems
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- A universal accelerated primal-dual method for convex optimization problems
- Uniform rank gradient, cost, and local-global convergence
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Solving convex min-min problems with smoothness and strong convexity in one group of variables and low dimension in the other
- Faster \(p\)-norm minimizing flows, via smoothed \(q\)-norm problems
- Convex Synthesis of Accelerated Gradient Algorithms