First-order methods of smooth convex optimization with inexact oracle
DOI: 10.1007/s10107-013-0677-5 · zbMATH Open: 1317.90196 · OpenAlex: W2124768887 · MaRDI QID: Q403634 · FDO: Q403634
Olivier Devolder, Yuri Nesterov, François Glineur
Publication date: 29 August 2014
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://doi.org/10.1007/s10107-013-0677-5
Recommendations
- Accelerated bundle level methods with inexact oracle
- Composite convex optimization with global and local inexact oracles
- Universal intermediate gradient method for convex problems with inexact oracle
- Smoothing and first order methods: a unified framework
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
Keywords: first-order methods; gradient methods; inexact oracle; smooth convex optimization; large-scale optimization; fast gradient methods; complexity bounds
MSC: Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Abstract computational complexity for mathematical programming problems (90C60)
Cites Work
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Title not available
- Title not available
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Title not available
- Convergence of some algorithms for convex minimization
- Smooth Optimization with Approximate Gradient
- An optimal method for stochastic composite optimization
- A Proximal Bundle Method with Approximate Subgradient Linearizations
- A proximal bundle method based on approximate subgradients
- The effect of deterministic noise in subgradient methods
- Double smoothing technique for large-scale linearly constrained convex optimization
- Smoothing technique and its applications in semidefinite optimization
- Optimal methods of smooth convex minimization
- Title not available
Cited In (only showing first 100 items)
- General convergence analysis of stochastic first-order methods for composite optimization
- On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- Composite convex optimization with global and local inexact oracles
- An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Stochastic intermediate gradient method for convex optimization problems
- Inexact coordinate descent: complexity and preconditioning
- Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
- Management of a hydropower system via convex duality
- Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
- Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
- An adaptive accelerated first-order method for convex optimization
- Flexible low-rank statistical modeling with missing data and side information
- Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
- Distributed optimal coordination for multiple heterogeneous Euler-Lagrangian systems
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs
- First-order methods for convex optimization
- On the convergence rate of scaled gradient projection method
- A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Inexact first-order primal-dual algorithms
- Accelerated gradient boosting
- On the oracle complexity of first-order and derivative-free algorithms for smooth nonconvex minimization
- OSGA: a fast subgradient algorithm with optimal complexity
- A flexible coordinate descent method
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Convergence Analysis of Inexact Randomized Iterative Methods
- Bounds for the tracking error of first-order online optimization methods
- Universal gradient methods for convex optimization problems
- Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Efficient first-order methods for convex minimization: a constructive approach
- Convergence rates of accelerated proximal gradient algorithms under independent noise
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank-Wolfe Variants
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Efficiency of minimizing compositions of convex functions and smooth maps
- From Infinite to Finite Programs: Explicit Error Bounds with Applications to Approximate Dynamic Programming
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- New analysis and results for the Frank-Wolfe method
- Universal method for stochastic composite optimization problems
- Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
- Rate of convergence analysis of discretization and smoothing algorithms for semiinfinite minimax problems
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- On the resolution of misspecified convex optimization and monotone variational inequality problems
- Accelerated methods for saddle-point problem
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Generalized uniformly optimal methods for nonlinear programming
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
- CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
- Generalized maximum entropy estimation
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Robust hybrid zero-order optimization algorithms with acceleration via averaging in time
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Fast proximal algorithms for nonsmooth convex optimization
- Fast gradient methods for uniformly convex and weakly smooth problems
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
- Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- Analysis of biased stochastic gradient descent using sequential semidefinite programs
- A primal majorized semismooth Newton-CG augmented Lagrangian method for large-scale linearly constrained convex programming
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- A universal modification of the linear coupling method
- Title not available
- An accelerated first-order method for non-convex optimization on manifolds
- Optimal Algorithms for Stochastic Complementary Composite Minimization
- Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
- HT-AWGM: a hierarchical Tucker-adaptive wavelet Galerkin method for high-dimensional elliptic problems
- Inexact model: a framework for optimization and variational inequalities
- Variable Projection for NonSmooth Problems
- Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
- Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
- Decentralized saddle-point problems with different constants of strong convexity and strong concavity
- Small errors in random zeroth-order optimization are imaginary
- Optimal Affine-Invariant Smooth Minimization Algorithms
- Stopping rules for gradient methods for non-convex problems with additive noise in gradient
- Universal intermediate gradient method for convex problems with inexact oracle
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
- Adaptive constraint satisfaction for Markov decision process congestion games: application to transportation networks
- Implicit regularization with strongly convex bias: Stability and acceleration
- Distributed optimization with inexact oracle
- A nonlinear conjugate gradient method using inexact first-order information
- Accelerated Iterative Regularization via Dual Diagonal Descent
- Augmented Lagrangian optimization under fixed-point arithmetic
- A frequency-domain analysis of inexact gradient methods
- Complexity Certifications of First-Order Inexact Lagrangian Methods for General Convex Programming: Application to Real-Time MPC
- Universal Conditional Gradient Sliding for Convex Optimization
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle