First-order methods of smooth convex optimization with inexact oracle
DOI: 10.1007/s10107-013-0677-5 · zbMATH Open: 1317.90196 · OpenAlex: W2124768887 · MaRDI QID: Q403634 · FDO: Q403634
Olivier Devolder, Yuri Nesterov, François Glineur
Publication date: 29 August 2014
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-013-0677-5
Keywords: first-order methods; gradient methods; inexact oracle; smooth convex optimization; large-scale optimization; fast gradient methods; complexity bounds
MSC classification: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Abstract computational complexity for mathematical programming problems (90C60)
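The record's title and keywords name the paper's central object, a first-order method run with an inexact oracle. A minimal illustrative sketch (not taken from the paper's text; all names and parameters here are hypothetical) is gradient descent on a smooth strongly convex quadratic, where each gradient query is corrupted by a bounded perturbation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
# Random symmetric positive-definite A (eigenvalues >= 1), so f(x) = 0.5 x^T A x
# is smooth and strongly convex with minimizer x* = 0.
M = rng.standard_normal((n, n))
A = M.T @ M / n + np.eye(n)
L = np.linalg.eigvalsh(A).max()  # smoothness constant

def inexact_grad(x, noise_level=1e-3):
    """Exact gradient A x plus a bounded perturbation: the 'inexact oracle'."""
    noise = rng.standard_normal(n)
    return A @ x + noise_level * noise / np.linalg.norm(noise)

x = rng.standard_normal(n)
for _ in range(500):
    x = x - (1.0 / L) * inexact_grad(x)  # classical 1/L step size

f_final = 0.5 * x @ A @ x
# With exact gradients f_final would tend to 0; with a bounded gradient error
# the iterates stagnate at an accuracy proportional to the noise level.
print(f_final)
```

This matches the qualitative message associated with the paper's keywords: the plain gradient method degrades gracefully under bounded oracle error, while the attainable accuracy is limited by that error.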
Cites Work
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Convergence of some algorithms for convex minimization
- Smooth Optimization with Approximate Gradient
- An optimal method for stochastic composite optimization
- A Proximal Bundle Method with Approximate Subgradient Linearizations
- A proximal bundle method based on approximate subgradients
- The effect of deterministic noise in subgradient methods
- Double smoothing technique for large-scale linearly constrained convex optimization
- Smoothing technique and its applications in semidefinite optimization
- Optimal methods of smooth convex minimization
Cited In (only the first 100 items are shown)
- A universal modification of the linear coupling method
- An accelerated first-order method for non-convex optimization on manifolds
- Optimal Algorithms for Stochastic Complementary Composite Minimization
- Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
- HT-AWGM: a hierarchical Tucker-adaptive wavelet Galerkin method for high-dimensional elliptic problems
- Inexact model: a framework for optimization and variational inequalities
- Variable Projection for NonSmooth Problems
- Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
- Decentralized saddle-point problems with different constants of strong convexity and strong concavity
- Small errors in random zeroth-order optimization are imaginary
- Optimal Affine-Invariant Smooth Minimization Algorithms
- Stopping rules for gradient methods for non-convex problems with additive noise in gradient
- Universal intermediate gradient method for convex problems with inexact oracle
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- Adaptive constraint satisfaction for Markov decision process congestion games: application to transportation networks
- Implicit regularization with strongly convex bias: Stability and acceleration
- Distributed optimization with inexact oracle
- A nonlinear conjugate gradient method using inexact first-order information
- Accelerated Iterative Regularization via Dual Diagonal Descent
- Augmented Lagrangian optimization under fixed-point arithmetic
- A frequency-domain analysis of inexact gradient methods
- Complexity Certifications of First-Order Inexact Lagrangian Methods for General Convex Programming: Application to Real-Time MPC
- Universal Conditional Gradient Sliding for Convex Optimization
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Accelerated gradient methods with absolute and relative noise in the gradient
- Recent theoretical advances in decentralized distributed convex optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Complementary composite minimization, small gradients in general norms, and applications
- A simple method for convex optimization in the oracle model
- Utility/privacy trade-off as regularized optimal transport
- Algorithms with gradient clipping for stochastic optimization with heavy-tailed noise
- A dual approach for optimal algorithms in distributed optimization over networks
- A subgradient method for free material design
- Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
- Truncated Cauchy random perturbations for smoothed functional-based stochastic optimization
- Inexact reduced gradient methods in nonconvex optimization
- Online First-Order Framework for Robust Convex Optimization
- High-probability complexity bounds for non-smooth stochastic convex optimization with heavy-tailed noise
- Sequential Subspace Optimization for Quasar-Convex Optimization Problems with Inexact Gradient
- Differentially private inference via noisy optimization
- Primal-dual \(\varepsilon\)-subgradient method for distributed optimization
- Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
- Accelerated first-order methods for a class of semidefinite programs
- Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
- Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
- A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
- Exact gradient methods with memory
- Principled analyses and design of first-order methods with inexact proximal operators
- A heuristic adaptive fast gradient method in stochastic optimization problems
- Optimization for deep learning: an overview
- Towards accelerated rates for distributed optimization over time-varying networks
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- General convergence analysis of stochastic first-order methods for composite optimization
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- Composite convex optimization with global and local inexact oracles
- An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Stochastic intermediate gradient method for convex optimization problems
- Inexact coordinate descent: complexity and preconditioning
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
- Management of a hydropower system via convex duality
- Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
- Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
- An adaptive accelerated first-order method for convex optimization
- Flexible low-rank statistical modeling with missing data and side information
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Distributed optimal coordination for multiple heterogeneous Euler-Lagrangian systems
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs
- First-order methods for convex optimization
- On the convergence rate of scaled gradient projection method
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Inexact first-order primal-dual algorithms
- Accelerated gradient boosting
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- OSGA: a fast subgradient algorithm with optimal complexity
- A flexible coordinate descent method
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Convergence Analysis of Inexact Randomized Iterative Methods
- Bounds for the tracking error of first-order online optimization methods
- Universal gradient methods for convex optimization problems
- Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Efficient first-order methods for convex minimization: a constructive approach
- Convergence rates of accelerated proximal gradient algorithms under independent noise
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank--Wolfe Variants
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Efficiency of minimizing compositions of convex functions and smooth maps
- From Infinite to Finite Programs: Explicit Error Bounds with Applications to Approximate Dynamic Programming
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems