First-order methods of smooth convex optimization with inexact oracle

From MaRDI portal

DOI: 10.1007/s10107-013-0677-5
zbMath: 1317.90196
OpenAlex: W2124768887
MaRDI QID: Q403634

Olivier Devolder, François Glineur, Yu. E. Nesterov

Publication date: 29 August 2014

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: https://doi.org/10.1007/s10107-013-0677-5



Related Items

An adaptive accelerated first-order method for convex optimization
Generalized maximum entropy estimation
OSGA: a fast subgradient algorithm with optimal complexity
Stochastic intermediate gradient method for convex optimization problems
Inexact coordinate descent: complexity and preconditioning
Sequential Subspace Optimization for Quasar-Convex Optimization Problems with Inexact Gradient
A flexible coordinate descent method
On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
Exact gradient methods with memory
New results on subgradient methods for strongly convex optimization problems with a unified analysis
A frequency-domain analysis of inexact gradient methods
Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
On the convergence rate of scaled gradient projection method
Certification aspects of the fast gradient method for solving the dual of parametric convex programs
Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
From Infinite to Finite Programs: Explicit Error Bounds with Applications to Approximate Dynamic Programming
Exact worst-case convergence rates of the proximal gradient method for composite convex minimization
Fast proximal algorithms for nonsmooth convex optimization
Robust hybrid zero-order optimization algorithms with acceleration via averaging in time
Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
Distributed optimal coordination for multiple heterogeneous Euler-Lagrangian systems
Inexact first-order primal-dual algorithms
Scheduled Restart Momentum for Accelerated Stochastic Gradient Descent
Fast gradient methods for uniformly convex and weakly smooth problems
An optimal subgradient algorithm with subspace search for costly convex optimization problems
A new convergence analysis and perturbation resilience of some accelerated proximal forward–backward algorithms with errors
Convergence rates of accelerated proximal gradient algorithms under independent noise
Rate of convergence analysis of discretization and smoothing algorithms for semiinfinite minimax problems
Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
Optimal subgradient algorithms for large-scale convex optimization in simple domains
Accelerated gradient boosting
A primal majorized semismooth Newton-CG augmented Lagrangian method for large-scale linearly constrained convex programming
Achieving Geometric Convergence for Distributed Optimization Over Time-Varying Graphs
CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation
Efficient first-order methods for convex minimization: a constructive approach
A heuristic adaptive fast gradient method in stochastic optimization problems
Augmented Lagrangian optimization under fixed-point arithmetic
Optimal Affine-Invariant Smooth Minimization Algorithms
Accelerated methods for saddle-point problem
Conditional gradient type methods for composite nonlinear and stochastic optimization
HT-AWGM: a hierarchical Tucker-adaptive wavelet Galerkin method for high-dimensional elliptic problems
Online First-Order Framework for Robust Convex Optimization
Stochastic Model-Based Minimization of Weakly Convex Functions
Optimization for deep learning: an overview
Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
A universal modification of the linear coupling method
Management of a hydropower system via convex duality
Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
Universal gradient methods for convex optimization problems
Composite convex optimization with global and local inexact oracles
Frank-Wolfe and friends: a journey into projection-free first-order optimization methods
Universal method for stochastic composite optimization problems
Variable Projection for NonSmooth Problems
Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems
Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
Analysis of biased stochastic gradient descent using sequential semidefinite programs
Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
Analysis of Optimization Algorithms via Integral Quadratic Constraints: Nonstrongly Convex Problems
Bounds for the tracking error of first-order online optimization methods
General convergence analysis of stochastic first-order methods for composite optimization
Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
Complexity Certifications of First-Order Inexact Lagrangian Methods for General Convex Programming: Application to Real-Time MPC
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
New analysis and results for the Frank-Wolfe method
Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
Flexible low-rank statistical modeling with missing data and side information
Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming
Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
A family of subgradient-based methods for convex optimization problems in a unifying framework
Generalized uniformly optimal methods for nonlinear programming
A Subgradient Method for Free Material Design
Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
On the resolution of misspecified convex optimization and monotone variational inequality problems
Efficiency of minimizing compositions of convex functions and smooth maps
Accelerated Iterative Regularization via Dual Diagonal Descent
Robust Accelerated Gradient Methods for Smooth Strongly Convex Functions
Convergence Analysis of Inexact Randomized Iterative Methods
A dual approach for optimal algorithms in distributed optimization over networks
Inexact model: a framework for optimization and variational inequalities
Universal intermediate gradient method for convex problems with inexact oracle
Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs
Implicit regularization with strongly convex bias: Stability and acceleration
An inexact dual fast gradient-projection method for separable convex optimization with linear coupled constraints
Distributed optimization with inexact oracle
Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
An accelerated first-order method for non-convex optimization on manifolds
A nonlinear conjugate gradient method using inexact first-order information
Stopping rules for gradient methods for non-convex problems with additive noise in gradient
Universal Conditional Gradient Sliding for Convex Optimization
Primal-dual \(\varepsilon\)-subgradient method for distributed optimization
Accelerated gradient methods with absolute and relative noise in the gradient
Gradient-Type Methods for Optimization Problems with Polyak-Łojasiewicz Condition: Early Stopping and Adaptivity to Inexactness Parameter
Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization
Optimal Algorithms for Stochastic Complementary Composite Minimization
Decentralized saddle-point problems with different constants of strong convexity and strong concavity
Iteration-Complexity of First-Order Augmented Lagrangian Methods for Convex Conic Programming
Adaptive constraint satisfaction for Markov decision process congestion games: application to transportation networks
Principled analyses and design of first-order methods with inexact proximal operators
Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
First-order methods for convex optimization
Differentially private inference via noisy optimization
Unifying framework for accelerated randomized methods in convex optimization
Utility/privacy trade-off as regularized optimal transport
Algorithms with gradient clipping for stochastic optimization with heavy-tailed noise
Towards accelerated rates for distributed optimization over time-varying networks
Recent theoretical advances in decentralized distributed convex optimization



Cites Work