First-order methods for convex optimization
DOI: 10.1016/j.ejco.2021.100015 · zbMath: 1516.90048 · arXiv: 2101.00935 · OpenAlex: W3211250483 · MaRDI QID: Q6169988
Pavel Dvurechensky, Shimrit Shtern, Mathias Staudigl
Publication date: 12 July 2023
Published in: EURO Journal on Computational Optimization
Full work available at URL: https://arxiv.org/abs/2101.00935
Keywords: convex optimization; convergence rate; numerical algorithms; Bregman divergence; proximity operator; first-order methods; composite optimization; proximal mapping
MSC classification: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Research exposition (monographs, survey articles) pertaining to operations research and mathematical programming (90-02); Complexity and performance of numerical algorithms (65Y20)
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- Universal gradient methods for convex optimization problems
- Exact augmented Lagrangian duality for mixed integer linear programming
- Statistics for high-dimensional data. Methods, theory and applications
- Alternating proximal algorithms for linearly constrained variational inequalities: application to domain decomposition for PDE's
- Regularized Lotka-Volterra dynamical system as continuous proximal-like method in optimization
- Lectures on convex optimization
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Dual extrapolation and its applications to solving variational inequalities and related problems
- Accelerating the cubic regularization of Newton's method on convex problems
- Projected subgradient methods with non-Euclidean distances for non-differentiable convex minimization and variational inequalities
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- Proximal minimization algorithm with \(D\)-functions
- Condition number complexity of an elementary algorithm for computing a reliable solution of a conic linear system
- Stochastic mirror descent dynamics and their convergence in monotone variational inequalities
- A simplified view of first order methods for optimization
- From error bounds to the complexity of first-order descent methods for convex functions
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- Universal method for stochastic composite optimization problems
- A conditional gradient method with linear rate of convergence for solving convex linear systems
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- A first-order primal-dual algorithm for convex problems with applications to imaging
- An accelerated directional derivative method for smooth stochastic convex optimization
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- Bregman forward-backward operator splitting
- Restarting algorithms: sometimes there is free lunch
- Gradient methods for problems with inexact model of the objective
- An adaptive primal-dual framework for nonsmooth convex minimization
- Fixing and extending some recent results on the ADMM algorithm
- Implementable tensor methods in unconstrained convex optimization
- Alternating direction method for covariance selection models
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Accelerated directional search with non-Euclidean prox-structure
- Stochastic subgradient method converges on tame functions
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- First-order and stochastic optimization methods for machine learning
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems
- Universal method of searching for equilibria and stochastic equilibria in transportation networks
- Generalized uniformly optimal methods for nonlinear programming
- Linearly convergent away-step conditional gradient for non-strongly convex functions
- Random gradient-free minimization of convex functions
- Forward-backward splitting with Bregman distances
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Linear convergence of first order methods for non-strongly convex optimization
- Mirror descent and convex optimization problems with non-smooth inequality constraints
- Recursive aggregation of estimators by the mirror descent algorithm with averaging
- Randomized first order algorithms with applications to \(\ell _{1}\)-minimization
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Complexity analysis of interior point algorithms for non-Lipschitz and nonconvex minimization
- Stochastic intermediate gradient method for convex optimization problems
- Generalized self-concordant analysis of Frank-Wolfe algorithms
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- Conditional Gradient Sliding for Convex Optimization
- Minimum-Volume Ellipsoids
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Proximal Splitting Methods in Signal Processing
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Smoothing and First Order Methods: A Unified Framework
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Duality Between Subgradient and Conditional Gradient Methods
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Alternating Direction Algorithms for $\ell_1$-Problems in Compressive Sensing
- Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Smooth Optimization with Approximate Gradient
- Accelerated, Parallel, and Proximal Coordinate Descent
- Linearly Constrained Non-Lipschitz Optimization for Image Restoration
- A New Class of Alternating Proximal Minimization Algorithms with Costs-to-Move
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Lectures on Stochastic Programming
- Optimal methods of smooth convex minimization
- Some comments on Wolfe's ‘away step’
- Generalized equations and their solutions, part II: Applications to nonlinear programming
- Applications of a Splitting Algorithm to Decomposition in Convex Programming and Variational Inequalities
- Estimating the Largest Eigenvalue by the Power and Lanczos Algorithms with a Random Start
- Entropic Proximal Mappings with Applications to Nonlinear Programming
- Monotone Operators and the Proximal Point Algorithm
- Simplicial decomposition in nonlinear programming algorithms
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals
- An Interior Proximal Algorithm and the Exponential Multiplier Method for Semidefinite Programming
- Recursive algorithms, urn processes and chaining number of chain recurrent sets
- Variational Analysis
- Bregman Monotone Optimization Algorithms
- Barrier Operators and Associated Gradient-Like Dynamical Systems for Constrained Minimization Problems
- Using SeDuMi 1.02, A Matlab toolbox for optimization over symmetric cones
- Stochastic Methods for Composite and Weakly Convex Optimization Problems
- First-Order Methods in Optimization
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- On the Convergence of Gradient-Like Flows with Noisy Gradient Input
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A First-Order Primal-Dual Algorithm with Linesearch
- Optimization Methods for Large-Scale Machine Learning
- Non-convex Optimization for Machine Learning
- A variational perspective on accelerated methods in optimization
- FOM – a MATLAB toolbox of first-order methods for solving convex optimization problems
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Hessian Riemannian Gradient Flows in Convex Programming
- Singular Riemannian barrier methods and gradient-projection dynamical systems for constrained optimization
- An extension of the Frank and Wolfe method of feasible directions
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Katyusha: the first direct acceleration of stochastic gradient methods
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Hessian Barrier Algorithms for Linearly Constrained Optimization Problems
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Linear convergence of a modified Frank–Wolfe algorithm for computing minimum-volume enclosing ellipsoids
- Prediction, Learning, and Games
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Proximité et dualité dans un espace hilbertien [Proximity and duality in a Hilbert space]
- Some methods of speeding up the convergence of iteration methods
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- A Tight Upper Bound on the Rate of Convergence of Frank-Wolfe Algorithm
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- A Linearly Convergent Variant of the Conditional Gradient Algorithm under Strong Convexity, with Applications to Online and Stochastic Optimization
- A Stochastic Approximation Method
- Compressed sensing
- The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent
- New analysis and results for the Frank-Wolfe method
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization