The exact information-based complexity of smooth convex minimization
From MaRDI portal
Publication:511109
Abstract: We obtain a new lower bound on the information-based complexity of first-order minimization of smooth and convex functions. We show that the bound matches the worst-case performance of the recently introduced Optimized Gradient Method, thereby establishing that the bound is tight and can be realized by an efficient algorithm. The proof is based on a novel construction technique of smooth and convex functions.
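The matching upper bound in the abstract is attained by the Optimized Gradient Method (OGM) of Kim and Fessler. As a minimal sketch, assuming the standard two-sequence form of OGM with step size 1/L and the modified momentum parameter on the final iteration (all names here are illustrative, not from the source):

```python
import math

def ogm(grad, x0, L, n_iter):
    """Sketch of the Optimized Gradient Method for an L-smooth convex f.

    grad:   gradient oracle, maps a point (list of floats) to its gradient
    x0:     starting point
    L:      smoothness (Lipschitz) constant of the gradient
    n_iter: number of gradient evaluations
    """
    theta = 1.0
    x = list(x0)
    y = list(x0)
    for i in range(n_iter):
        # Gradient step from the primary iterate.
        y_new = [xj - gj / L for xj, gj in zip(x, grad(x))]
        # Momentum parameter; the final step uses a larger factor (8 vs 4),
        # which is what makes the worst-case bound tight.
        if i < n_iter - 1:
            theta_new = (1 + math.sqrt(1 + 4 * theta ** 2)) / 2
        else:
            theta_new = (1 + math.sqrt(1 + 8 * theta ** 2)) / 2
        # Combine the usual Nesterov-style momentum with an extra
        # correction term (y_new - x), characteristic of OGM.
        x = [yn + (theta - 1) / theta_new * (yn - yo)
             + theta / theta_new * (yn - xj)
             for yn, yo, xj in zip(y_new, y, x)]
        y, theta = y_new, theta_new
    return x  # the iterate covered by the worst-case guarantee

# Illustrative run: minimize f(x) = 0.5*(x1^2 + 10*x2^2), so L = 10.
grad = lambda x: [1.0 * x[0], 10.0 * x[1]]
x_final = ogm(grad, [1.0, 1.0], L=10.0, n_iter=30)
```

With the final-step parameter theta_N, OGM guarantees f(x_N) - f* <= L * ||x0 - x*||^2 / (2 * theta_N^2), which is exactly the worst-case value the paper's lower-bound construction shows no first-order method can beat.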
Recommendations
- scientific article; zbMATH DE number 3516928
- Lower Bounds on the Oracle Complexity of Nonsmooth Convex Optimization via Information Theory
- Information complexity of mixed-integer convex optimization
- Information-Based Complexity, Feedback and Dynamics in Convex Programming
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- On the oracle complexity of smooth strongly convex minimization
- Information complexity of functional optimization problems and their approximation schemes
- On the minimal cost of approximating linear problems based on information with deterministic noise
- Informational inequalities in the problem of gradient stochastic optimization and optimal realizable algorithms
- Sequential convex programming for computing information-theoretic minimal partitions: nonconvex nonsmooth optimization
Cites Work
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 1206370
- An optimal variant of Kelley's cutting-plane method
- Information-Based Complexity, Feedback and Dynamics in Convex Programming
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Information-based complexity of linear operator equations
- Introductory lectures on convex optimization. A basic course.
- On complexity of stochastic programming problems
- On lower complexity bounds for large-scale smooth convex optimization
- Optimized first-order methods for smooth convex minimization
- Performance of first-order methods for smooth convex minimization: a novel approach
Cited In (26)
- On the properties of convex functions over open sets
- Potential Function-Based Framework for Minimizing Gradients in Convex and Min-Max Optimization
- On the oracle complexity of smooth strongly convex minimization
- On lower and upper bounds in smooth and strongly convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Bilevel Methods for Image Reconstruction
- Another look at the fast iterative shrinkage/thresholding algorithm (FISTA)
- Generalizing the optimized gradient method for smooth convex minimization
- Operator splitting performance estimation: tight contraction factors and optimal parameter selection
- On the minimal cost of approximating linear problems based on information with deterministic noise
- Exact worst-case performance of first-order methods for composite convex optimization
- Some worst-case datasets of deterministic first-order methods for solving binary logistic regression
- Efficient first-order methods for convex minimization: a constructive approach
- Optimal complexity and certification of Bregman first-order methods
- On lower complexity bounds for large-scale smooth convex optimization
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Adaptive restart of the optimized gradient method for convex optimization
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions
- An optimal gradient method for smooth strongly convex minimization
- On parallel complexity of nonsmooth convex optimization
- On the convergence analysis of the optimized gradient method
- Informational inequalities in the problem of gradient stochastic optimization and optimal realizable algorithms
- Fully polynomial time \((\Sigma,\Pi)\)-approximation schemes for continuous nonlinear newsvendor and continuous stochastic dynamic programs
- Branch-and-bound performance estimation programming: a unified methodology for constructing optimal optimization methods
- Information complexity of mixed-integer convex optimization