Smooth Optimization with Approximate Gradient
Publication: 3395010
DOI: 10.1137/060676386
zbMath: 1180.90378
arXiv: math/0512344
OpenAlex: W2121210949
MaRDI QID: Q3395010
Publication date: 20 August 2009
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/math/0512344
Mathematics Subject Classification:
Numerical mathematical programming methods (65K05)
Semidefinite programming (90C22)
Abstract computational complexity for mathematical programming problems (90C60)
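The publication cataloged here concerns first-order optimization when only an approximate gradient is available. As a purely illustrative sketch (not the paper's algorithm, and with step size, noise model, and test function chosen here for demonstration only), gradient descent with a bounded-error gradient oracle can look like:

```python
import random

def inexact_gradient_descent(grad, x0, step, delta, iters, seed=0):
    """Run gradient descent where the oracle returns the true gradient
    plus bounded noise of magnitude at most `delta`."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        # Approximate gradient: true gradient corrupted by bounded error.
        g = grad(x) + rng.uniform(-delta, delta)
        x = x - step * g
    return x

# Minimize f(x) = (x - 3)^2, whose exact gradient is 2 * (x - 3).
x_star = inexact_gradient_descent(lambda x: 2 * (x - 3),
                                  x0=0.0, step=0.1, delta=0.01, iters=200)
```

With a small, bounded oracle error the iterates still contract toward the minimizer, settling in a neighborhood whose radius scales with the error bound `delta`; quantifying exactly such behavior for smooth methods is the subject of the cataloged paper and many of the related items below.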
Related Items (38)
Lower Bounds for Parallel and Randomized Convex Optimization
Sequential Subspace Optimization for Quasar-Convex Optimization Problems with Inexact Gradient
Accelerated gradient sliding for structured convex optimization
A frequency-domain analysis of inexact gradient methods
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
Robust hybrid zero-order optimization algorithms with acceleration via averaging in time
Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
Inexact first-order primal-dual algorithms
Semi-discrete optimal transport: hardness, regularization and numerical solution
Accelerated differential inclusion for convex optimization
Stopping rules for gradient methods for non-convex problems with additive noise in gradient
First-order methods of smooth convex optimization with inexact oracle
A dual gradient-projection algorithm for model predictive control in fixed-point arithmetic
Accelerated gradient methods with absolute and relative noise in the gradient
Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
First-order methods for convex optimization
Differentially private inference via noisy optimization
Unifying framework for accelerated randomized methods in convex optimization
Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
An optimal method for stochastic composite optimization
Automatic alignment for three-dimensional tomographic reconstruction
HT-AWGM: a hierarchical Tucker-adaptive wavelet Galerkin method for high-dimensional elliptic problems
A generalization of Löwner-John's ellipsoid theorem
Composite convex optimization with global and local inexact oracles
An introduction to continuous optimization for imaging
An acceleration procedure for optimal first-order methods
Analysis of biased stochastic gradient descent using sequential semidefinite programs
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
New analysis and results for the Frank-Wolfe method
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
An Accelerated Linearized Alternating Direction Method of Multipliers
Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
Universal intermediate gradient method for convex problems with inexact oracle
An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
This page was built for publication: Smooth Optimization with Approximate Gradient